OpenCodePapers

# Link Prediction on Wikidata5M
[Chart: results over time, per metric and model]
## Leaderboard
| Paper | Code | MRR | Hits@10 | Hits@1 | Hits@3 | Model | Release date |
|---|---|---|---|---|---|---|---|
| MoCoKGC: Momentum Contrast Entity Encoding for Knowledge Graph Completion | — | 0.490 | 0.591 | 0.435 | 0.517 | MoCoKGC | 2024-11-12 |
| Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction | ✓ | 0.426 | 0.460 | 0.406 | 0.440 | KGT5-context + Description | 2023-05-22 |
| Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction | ✓ | 0.381 | 0.422 | 0.357 | 0.397 | KGT5 + Description | 2023-05-22 |
| Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction | ✓ | 0.378 | 0.427 | 0.350 | 0.396 | KGT5-context | 2023-05-22 |
| SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models | ✓ | 0.358 | 0.441 | 0.313 | 0.376 | SimKGC + Description | 2022-03-04 |
| Sequence-to-Sequence Knowledge Graph Completion and Question Answering | ✓ | 0.336 | 0.426 | 0.282 | 0.362 | KGT5 ComplEx Ensemble | 2022-03-19 |
| Parallel Training of Knowledge Graph Embedding Models: A Comparison of Techniques | ✓ | 0.308 | 0.398 | 0.255 | — | ComplEx | 2021-11-01 |
| Sequence-to-Sequence Knowledge Graph Completion and Question Answering | ✓ | 0.300 | 0.365 | 0.267 | 0.318 | KGT5 | 2022-03-19 |
| KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | ✓ | 0.296 | 0.377 | 0.252 | 0.317 | SimplE | 2019-11-13 |
| KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | ✓ | 0.290 | 0.390 | 0.234 | 0.322 | RotatE | 2019-11-13 |
| KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | ✓ | 0.281 | 0.373 | 0.228 | 0.310 | ComplEx | 2019-11-13 |
| KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | ✓ | 0.253 | 0.392 | 0.170 | 0.311 | TransE | 2019-11-13 |
| KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | ✓ | 0.253 | 0.334 | 0.208 | 0.278 | DistMult | 2019-11-13 |
| KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation | ✓ | 0.210 | 0.277 | 0.173 | 0.224 | KEPLER-Wiki-rel | 2019-11-13 |

A ✓ in the Code column marks entries with a linked code release; — marks a value not reported.