OpenCodePapers

Cross-Lingual Question Answering on MLQA

Task: Question Answering › Cross-Lingual Question Answering
Dataset: MLQA
Results over time (interactive chart of F1 and EM per model by release date; not reproduced here).
Leaderboard
| Paper | Code | F1 | EM | Model Name | Release Date |
|---|---|---|---|---|---|
| ByT5: Towards a token-free future with pre-trained byte-to-byte models | ✓ Link | 71.6 | 54.9 | ByT5 XXL | 2021-05-28 |
| Rethinking embedding coupling in pre-trained language models | ✓ Link | 53.1 | 37.3 | Coupled | 2020-10-24 |
| Rethinking embedding coupling in pre-trained language models | ✓ Link | 53.1 | – | Decoupled | 2020-10-24 |
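The F1 and EM columns are SQuAD-style answer-overlap metrics: EM is the fraction of predictions that exactly match a gold answer after normalization, and F1 measures token-level overlap between the predicted and gold answer spans. MLQA's official evaluation script extends this with per-language normalization rules; the sketch below is an English-only illustration, and the function names and example strings are my own rather than taken from the benchmark code.

```python
# Minimal sketch of SQuAD-style answer scoring, the kind of metric behind the
# F1 and EM columns above. MLQA's official evaluation adapts this with
# per-language normalization; this English-only version is for illustration.
import re
import string
from collections import Counter


def normalize(text: str) -> str:
    """Lowercase, drop punctuation and English articles, collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in set(string.punctuation))
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())


def exact_match(prediction: str, gold: str) -> float:
    """EM: 1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(gold))


def f1_score(prediction: str, gold: str) -> float:
    """Token-level F1 between the normalized prediction and gold answer."""
    pred_tokens = normalize(prediction).split()
    gold_tokens = normalize(gold).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)


if __name__ == "__main__":
    print(exact_match("the Eiffel Tower", "Eiffel Tower"))              # 1.0
    print(round(f1_score("Eiffel Tower in Paris", "Eiffel Tower"), 3))  # 0.667
```

Reported leaderboard numbers are averages of these per-question scores over the MLQA test set (typically across all seven languages).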