Paper | Code | F0.5 | Model Name | Release Date |
---|---|---|---|---|
To Err Is Human, but Llamas Can Learn It Too | ✓ Link | 74.09 | Llama + 1M BT + gold | 2024-03-08 |
Comparative study of models trained on synthetic data for Ukrainian grammatical error correction | ✓ Link | 68.17 | mBART-based model with synthetic data | 2024-05-05 |
A Low-Resource Approach to the Grammatical Error Correction of Ukrainian | ✓ Link | 68.09 | mT5 large + 10M synth | 2023-05-05 |
RedPenNet for Grammatical Error Correction: Outputs to Tokens, Attentions to Spans | ✓ Link | 67.71 | RedPenNet | 2023-09-19 |
GPT-3.5 for Grammatical Error Correction | | 27.4 | ChatGPT (zero-shot) | 2024-05-14 |
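For reference, the F0.5 scores above come from the standard F-beta formula with beta = 0.5, which weights precision twice as heavily as recall (useful for GEC, where a wrong "correction" is worse than a missed one). A minimal sketch, with illustrative precision/recall values rather than numbers from any of the papers above:

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """F-beta score; beta=0.5 favors precision over recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical system with precision 0.8 and recall 0.5:
score = f_beta(0.8, 0.5)
print(f"F0.5 = {score * 100:.2f}")  # reported on the 0-100 scale used in the table
```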