Paper | Code | Accuracy | MCC | Model Name | Release Date |
---|---|---|---|---|---|
An end-to-end attention-based approach for learning on graphs | ✓ Link | 94.800±0.424 | 0.935±0.005 | ESA (Edge set attention, no positional encodings) | 2024-02-16 |
Unlocking the Potential of Classic GNNs for Graph-level Tasks: Simple Architectures Meet Excellence | ✓ Link | 94.600±0.570 | | GatedGCN+ | 2025-02-13 |
Exphormer: Sparse Transformers for Graphs | ✓ Link | 94.02±0.209 | | Exphormer | 2023-03-10 |
Recipe for a General, Powerful, Scalable Graph Transformer | ✓ Link | 93.36±0.6 | | GPS | 2022-05-25 |