Paper | Code | BLEU | SacreBLEU | Model | Date |
--- | --- | --- | --- | --- | --- |
DeLighT: Deep and Light-weight Transformer | ✓ Link | 34.7 | | DeLighT | 2020-08-03 |
Incorporating a Local Translation Mechanism into Non-autoregressive Translation | ✓ Link | 32.87 | | CMLM+LAT+4 iterations | 2020-11-12 |
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | ✓ Link | 32.35 | | FlowSeq-large (NPD n = 30) | 2019-09-05 |
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | ✓ Link | 31.97 | | FlowSeq-large (NPD n = 15) | 2019-09-05 |
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | ✓ Link | 31.08 | | FlowSeq-large (IWD n = 15) | 2019-09-05 |
Incorporating a Local Translation Mechanism into Non-autoregressive Translation | ✓ Link | 30.74 | | CMLM+LAT+1 iterations | 2020-11-12 |
Convolutional Sequence to Sequence Learning | ✓ Link | 29.9 | | ConvS2S BPE40k | 2017-05-08 |
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | ✓ Link | 29.86 | | FlowSeq-large | 2019-09-05 |
Non-Autoregressive Neural Machine Translation | ✓ Link | 29.79 | | NAT +FT + NPD | 2017-11-07 |
Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement | ✓ Link | 29.66 | | Denoising autoencoders (non-autoregressive) | 2018-02-19 |
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | ✓ Link | 29.26 | | FlowSeq-base | 2019-09-05 |
The QT21/HimL Combined Machine Translation System | | 28.9 | | GRU BPE90k | 2016-08-01 |
Edinburgh Neural Machine Translation Systems for WMT 16 | ✓ Link | 28.1 | | BiGRU | 2016-06-09 |
A Convolutional Encoder Model for Neural Machine Translation | ✓ Link | 27.8 | | Deep Convolutional Encoder; single-layer decoder | 2016-11-07 |
A Convolutional Encoder Model for Neural Machine Translation | ✓ Link | 27.5 | | BiLSTM | 2016-11-07 |
Phrase-Based & Neural Unsupervised Machine Translation | ✓ Link | 25.13 | | PBSMT + NMT | 2018-04-20 |
Phrase-Based & Neural Unsupervised Machine Translation | ✓ Link | 21.33 | | Unsupervised PBSMT | 2018-04-20 |
Phrase-Based & Neural Unsupervised Machine Translation | ✓ Link | 21.18 | | Unsupervised NMT + Transformer | 2018-04-20 |
Finetuned Language Models Are Zero-Shot Learners | ✓ Link | 20.5 | | FLAN 137B (few-shot, k=9) | 2021-09-03 |
Finetuned Language Models Are Zero-Shot Learners | ✓ Link | 18.9 | | FLAN 137B (zero-shot) | 2021-09-03 |
TextBox 2.0: A Text Generation Library with Pre-trained Language Models | ✓ Link | | 37.2 | BART (TextBox 2.0) | 2022-12-26 |