OpenCodePapers

Cross-Lingual Question Answering on TyDiQA

Results over time
Leaderboard
| Paper | Code | EM | F1 | Model | Release date |
|---|---|---|---|---|---|
| ByT5: Towards a token-free future with pre-trained byte-to-byte models | ✓ Link | 81.9 | – | ByT5 (fine-tuned) | 2021-05-28 |
| Transcending Scaling Laws with 0.1% Extra Compute | – | 78.4 | 88.5 | U-PaLM 62B (fine-tuned) | 2022-10-20 |
| Scaling Instruction-Finetuned Language Models | ✓ Link | 68.3 | – | Flan-U-PaLM 540B (direct-prompting) | 2022-10-20 |
| Scaling Instruction-Finetuned Language Models | ✓ Link | 67.8 | – | Flan-PaLM 540B (direct-prompting) | 2022-10-20 |
| ByT5: Towards a token-free future with pre-trained byte-to-byte models | ✓ Link | 60.0 | 75.3 | ByT5 XXL | 2021-05-28 |
| Transcending Scaling Laws with 0.1% Extra Compute | – | 54.6 | – | U-PaLM-540B (CoT) | 2022-10-20 |
| PaLM: Scaling Language Modeling with Pathways | ✓ Link | 52.9 | – | PaLM-540B (CoT) | 2022-04-05 |
| Rethinking embedding coupling in pre-trained language models | ✓ Link | 42.8 | 58.1 | Decoupled | 2020-10-24 |
| PaLM 2 Technical Report | ✓ Link | – | 73.6 | PaLM 2-L (one-shot) | 2023-05-17 |
| PaLM 2 Technical Report | ✓ Link | – | 73.3 | PaLM 2-S (one-shot) | 2023-05-17 |
| PaLM 2 Technical Report | ✓ Link | – | 73.3 | PaLM 2-M (one-shot) | 2023-05-17 |
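EM (exact match) and F1 are the standard extractive-QA metrics: EM scores 1 only when the normalized prediction equals a gold answer string, while F1 measures token-level overlap between prediction and gold answer. The sketch below shows SQuAD-style scoring as an illustration; it is an assumption that this mirrors TyDiQA GoldP scoring exactly, since the official evaluation script applies language-specific answer normalization rather than the English-only rules used here.

```python
# Minimal sketch of SQuAD-style EM and token-level F1.
# Assumption: English-only normalization (lowercasing, punctuation and
# article stripping); the official TyDiQA script handles each language's
# own conventions.
import re
import string
from collections import Counter

def normalize(text: str) -> str:
    """Lowercase, drop punctuation and English articles, collapse spaces."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction: str, gold: str) -> float:
    """1.0 iff normalized strings are identical."""
    return float(normalize(prediction) == normalize(gold))

def f1_score(prediction: str, gold: str) -> float:
    """Harmonic mean of token precision and recall after normalization."""
    pred_tokens = normalize(prediction).split()
    gold_tokens = normalize(gold).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("the Eiffel Tower", "Eiffel Tower"))   # 1.0 after normalization
print(f1_score("Eiffel Tower in Paris", "Eiffel Tower")) # ~0.667
```

Benchmarks typically take the maximum score over all gold answers for a question and average over the dataset, which is how the percentages in the leaderboard are produced.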