Paper | Code | Smoothed BLEU-4 | Model Name | Release Date |
---|---|---|---|---|
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 25.61 | Transformer | 2020-02-19 |
CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing | ✓ Link | 18.98 | CodeTrans-TF-Large | 2021-04-06 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 9.54 | CodeBERT (MLM+RTD) | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 8.73 | CodeBERT (RTD) | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 8.51 | CodeBERT (MLM) | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 8.3 | pre-train w/ code only | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 6.88 | seq2seq | 2020-02-19 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 5.72 | RoBERTa | 2020-02-19 |
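The scores above are smoothed BLEU-4, the metric commonly used in code-summarization papers such as CodeBERT. A minimal sketch of how it can be computed is below, assuming the widely used add-one smoothing on higher-order n-gram precisions plus the standard brevity penalty; the exact smoothing in each paper's evaluation script may differ, so treat this as illustrative rather than the reference implementation.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def smoothed_bleu4(candidate, reference):
    """Sentence-level smoothed BLEU-4 (0-100), single reference.

    Assumption: add-one smoothing on n-gram precisions for n > 1,
    geometric mean of the four precisions, standard brevity penalty.
    """
    precisions = []
    for n in range(1, 5):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        if n > 1:  # smooth higher-order precisions to avoid zeros
            overlap += 1
            total += 1
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / 4)
    # Brevity penalty: penalize candidates shorter than the reference.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return 100.0 * bp * geo_mean
```

For example, a candidate identical to its reference scores 100, while a candidate sharing no unigrams with the reference scores 0.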