Paper | Code | Accuracy (%) | Model Name | Release Date |
---|---|---|---|---|
Clarify Confused Nodes via Separated Learning | ✓ Link | 90.81 ± 0.46 | NCGCN | 2023-06-04 |
Clarify Confused Nodes via Separated Learning | ✓ Link | 90.43 ± 0.72 | NCSAGE | 2023-06-04 |
Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework | ✓ Link | 85.5 | CPF-ind-GAT | 2021-03-04 |
Towards Deeper Graph Neural Networks | ✓ Link | 84.5 ± 1.2 | DAGNN | 2020-07-18 |
Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation | ✓ Link | 83.03 ± 1.87 | GLNN | 2021-10-17 |