OpenCodePapers

Machine Translation on WMT2016 English-German

Machine Translation
Leaderboard
| Paper | Code | BLEU score | SacreBLEU | Model | Release Date |
| --- | --- | --- | --- | --- | --- |
| Exploiting Monolingual Data at Scale for Neural Machine Translation | | 40.9 | | Exploiting Mono at Scale (single) | 2019-11-01 |
| Multi-Agent Dual Learning | | 40.68 | | MADL | 2019-05-01 |
| Edinburgh Neural Machine Translation Systems for WMT 16 | ✓ | 34.2 | | Attentional encoder-decoder + BPE | 2016-06-09 |
| Linguistic Input Features Improve Neural Machine Translation | ✓ | 28.4 | | Linguistic Input Features | 2016-06-09 |
| DeLighT: Deep and Light-weight Transformer | ✓ | 28.0 | | DeLighT | 2020-08-03 |
| Finetuned Language Models Are Zero-Shot Learners | ✓ | 27.0 | | FLAN 137B (zero-shot) | 2021-09-03 |
| On the adequacy of untuned warmup for adaptive optimization | ✓ | 26.7 | | Transformer | 2019-10-09 |
| Finetuned Language Models Are Zero-Shot Learners | ✓ | 26.1 | | FLAN 137B (few-shot, k=11) | 2021-09-03 |
| Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks | | 24.9 | | BiRNN + GCN (Syn + Sem) | 2018-04-23 |
| Unsupervised Statistical Machine Translation | ✓ | | 18.23 | SMT + iterative backtranslation (unsupervised) | 2018-09-04 |
| Unsupervised Neural Machine Translation with Weight Sharing | ✓ | 10.86 | | Unsupervised NMT + weight-sharing | 2018-04-24 |
| Unsupervised Machine Translation Using Monolingual Corpora Only | ✓ | 9.64 | | Unsupervised S2S with attention | 2017-10-31 |
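The BLEU scores above measure n-gram overlap between system output and reference translations; the SacreBLEU column denotes scores computed with the sacreBLEU toolkit's standardized tokenization. A minimal self-contained sketch of corpus-level BLEU (illustrative only; the `corpus_bleu` and `ngrams` helpers are our own names, and real evaluations should use the sacreBLEU tool for comparable numbers):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(candidates, references, max_n=4):
    """Corpus BLEU: geometric mean of modified n-gram precisions
    (n = 1..max_n), scaled by a brevity penalty. Whitespace
    tokenization only -- a simplification of sacreBLEU's scheme."""
    clipped = [0] * max_n   # matched n-gram counts, clipped by reference
    total = [0] * max_n     # total candidate n-gram counts
    cand_len = ref_len = 0
    for cand, ref in zip(candidates, references):
        c, r = cand.split(), ref.split()
        cand_len += len(c)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            c_ngrams, r_ngrams = ngrams(c, n), ngrams(r, n)
            total[n - 1] += sum(c_ngrams.values())
            clipped[n - 1] += sum(min(cnt, r_ngrams[g])
                                  for g, cnt in c_ngrams.items())
    if min(clipped) == 0:   # any zero precision drives BLEU to 0
        return 0.0
    log_prec = sum(math.log(clipped[i] / total[i])
                   for i in range(max_n)) / max_n
    # Brevity penalty: punish candidates shorter than the references.
    bp = 1.0 if cand_len > ref_len else math.exp(1 - ref_len / max(cand_len, 1))
    return 100 * bp * math.exp(log_prec)
```

A perfect match scores 100; the leaderboard values correspond to system outputs over the full WMT newstest2016 English-German test set.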