Paper | Code | F1 | EM | Model | Release Date |
---|---|---|---|---|---|
ByT5: Towards a token-free future with pre-trained byte-to-byte models | ✓ Link | 71.6 | 54.9 | ByT5 XXL | 2021-05-28 |
Rethinking embedding coupling in pre-trained language models | ✓ Link | 53.1 | 37.3 | Coupled | 2020-10-24 |
Rethinking embedding coupling in pre-trained language models | ✓ Link | 53.1 | | Decoupled | 2020-10-24 |
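
The F1 and EM columns above are the standard SQuAD-style answer-comparison metrics, reported as percentages. As a reference, here is a minimal sketch of how these two scores are typically computed per example; the normalization rules (lowercasing, stripping punctuation and English articles) follow the common SQuAD evaluation convention and are an assumption, since this leaderboard's exact evaluation script is not specified here.

```python
import re
import string
from collections import Counter

def normalize(text: str) -> str:
    """Lowercase, drop punctuation and articles, collapse whitespace (SQuAD convention)."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in set(string.punctuation))
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction: str, reference: str) -> float:
    """EM: 1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(reference))

def f1_score(prediction: str, reference: str) -> float:
    """Token-level F1: harmonic mean of precision and recall over overlapping tokens."""
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

# Per-example scores; dataset-level numbers average these and scale to percentages.
print(exact_match("the cat", "cat"))           # 1.0 (articles are stripped by normalization)
print(round(f1_score("black cat", "cat"), 3))  # 0.667 (precision 0.5, recall 1.0)
```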