Paper | Code | Smoothed BLEU-4 | Model Name | Release Date |
---|---|---|---|---|
CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing | ✓ Link | 26.23 | CodeTrans-MT-Base | 2021-04-06 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 21.32 | CodeBERT (MLM+RTD) | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 21.00 | CodeBERT (MLM) | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 20.71 | pre-train w/ code only | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 20.25 | CodeBERT (RTD) | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 19.90 | RoBERTa | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 18.40 | seq2seq | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 18.25 | Transformer | 2020-02-19 |
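
The scores above are reported as smoothed BLEU-4, i.e. a corpus-level average of sentence-level 4-gram BLEU with smoothing applied to avoid zero scores on short generated summaries. The sketch below shows one common way to compute such a metric; it uses NLTK and the `method4` smoothing choice as assumptions, and the official evaluation scripts for each paper may apply a different smoothing variant or tokenization.

```python
# Minimal sketch of a smoothed BLEU-4 metric, assuming NLTK and
# whitespace tokenization; not the official scoring script of any paper.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def smoothed_bleu4(references, hypotheses):
    """Average sentence-level BLEU-4 with smoothing over a corpus.

    references: list of reference summaries (strings)
    hypotheses: list of generated summaries (strings)
    """
    smooth = SmoothingFunction().method4  # assumed smoothing choice
    scores = []
    for ref, hyp in zip(references, hypotheses):
        scores.append(
            sentence_bleu(
                [ref.split()],            # list of reference token lists
                hyp.split(),              # hypothesis tokens
                weights=(0.25, 0.25, 0.25, 0.25),  # uniform up to 4-grams
                smoothing_function=smooth,
            )
        )
    return 100.0 * sum(scores) / max(len(scores), 1)

# Example usage with hypothetical strings:
refs = ["returns the sum of two integers"]
hyps = ["return the sum of two numbers"]
print(f"Smoothed BLEU-4: {smoothed_bleu4(refs, hyps):.2f}")
```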