Cross-Lingual Question Answering on TyDiQA
Task: Question Answering
[Figure: Results over time. EM and F1 for each model, plotted by release date.]
Leaderboard
| Paper | Code | EM | F1 | Model Name | Release Date |
| --- | --- | --- | --- | --- | --- |
| ByT5: Towards a token-free future with pre-trained byte-to-byte models | ✓ | 81.9 | | ByT5 (fine-tuned) | 2021-05-28 |
| Transcending Scaling Laws with 0.1% Extra Compute | | 78.4 | 88.5 | U-PaLM 62B (fine-tuned) | 2022-10-20 |
| Scaling Instruction-Finetuned Language Models | ✓ | 68.3 | | Flan-U-PaLM 540B (direct-prompting) | 2022-10-20 |
| Scaling Instruction-Finetuned Language Models | ✓ | 67.8 | | Flan-PaLM 540B (direct-prompting) | 2022-10-20 |
| ByT5: Towards a token-free future with pre-trained byte-to-byte models | ✓ | 60.0 | 75.3 | ByT5 XXL | 2021-05-28 |
| Transcending Scaling Laws with 0.1% Extra Compute | | 54.6 | | U-PaLM-540B (CoT) | 2022-10-20 |
| PaLM: Scaling Language Modeling with Pathways | ✓ | 52.9 | | PaLM-540B (CoT) | 2022-04-05 |
| Rethinking embedding coupling in pre-trained language models | ✓ | 42.8 | 58.1 | Decoupled | 2020-10-24 |
| PaLM 2 Technical Report | ✓ | | 73.6 | PaLM 2-L (one-shot) | 2023-05-17 |
| PaLM 2 Technical Report | ✓ | | 73.3 | PaLM 2-S (one-shot) | 2023-05-17 |
| PaLM 2 Technical Report | ✓ | | 73.3 | PaLM 2-M (one-shot) | 2023-05-17 |

Rows are sorted by EM (descending); entries that report only F1 appear last. Blank cells are metrics not reported on this leaderboard, and ✓ marks entries with a linked implementation.
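
On the metrics: EM (exact match) scores a prediction 1 only if it equals a gold answer after answer normalization, while F1 is the harmonic mean of token-level precision and recall against the gold answer, so F1 is always at least as high as EM for the same model. Below is a minimal sketch of the SQuAD-style scoring that TyDiQA-GoldP evaluation is commonly based on; the official evaluation applies language-specific adjustments, and the function names here are illustrative rather than taken from any of the papers above.

```python
import collections
import re
import string


def normalize(text: str) -> str:
    """SQuAD-style normalization: lowercase, strip punctuation,
    drop English articles, collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())


def exact_match(prediction: str, reference: str) -> float:
    """EM: 1.0 iff the normalized strings are identical."""
    return float(normalize(prediction) == normalize(reference))


def f1(prediction: str, reference: str) -> float:
    """Token-overlap F1 between normalized prediction and reference."""
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    common = collections.Counter(pred_tokens) & collections.Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)


print(exact_match("the Eiffel Tower", "Eiffel Tower"))        # 1.0 after normalization
print(round(f1("Eiffel Tower in Paris", "Eiffel Tower"), 2))  # 0.67 (partial overlap)
```

Normalization is why EM tolerates casing and article differences, and partial token overlap is why F1 in the table runs well above EM for the same entry.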