Paper | Code | Overall | Go | Ruby | Python | Java | JS | PHP | ModelName | ReleaseDate |
---|---|---|---|---|---|---|---|---|---|---|
Text and Code Embeddings by Contrastive Pre-Training | ✓ Link | 93.5 | 97.5 | 85.5 | 99.9 | 94.4 | 86.5 | 97.2 | cpt-code M | 2022-01-24 |
Text and Code Embeddings by Contrastive Pre-Training | ✓ Link | 93.4 | 97.7 | 86.3 | 99.8 | 94.0 | 86.0 | 96.7 | cpt-code S | 2022-01-24 |
CodeT5+: Open Code Large Language Models for Code Understanding and Generation | ✓ Link | 77.4 | 92.7 | 78.0 | 75.8 | 76.2 | 71.3 | 70.1 | CodeT5+ 770M | 2023-05-13 |
GraphCodeBERT: Pre-training Code Representations with Data Flow | ✓ Link | 77.4 | 84.1 | 73.2 | 87.9 | 75.7 | 71.1 | 72.5 | GraphCodeBERT | 2020-09-17 |
CodeT5+: Open Code Large Language Models for Code Understanding and Generation | ✓ Link | 77.1 | 92.4 | 77.7 | 75.6 | 76.1 | 70.8 | 69.8 | CodeT5+ 220M | 2023-05-13 |
CodeBERT: A Pre-Trained Model for Programming and Natural Languages | ✓ Link | 76.0 | 69.3 | 70.6 | 84.0 | 86.8 | 74.8 | 70.6 | CodeBERT | 2020-02-19 |