Paper | Code | BLEU↑ | Exact Match↑ | FCD↓ | Levenshtein↓ | MACCS FTS↑ | Morgan FTS↑ | RDK FTS↑ | Text2Mol↑ | Validity↑ | Params | Model | Date |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
LDMol: Text-to-Molecule Diffusion Model with Structurally Informative Latent Space | ✓ Link | 92.6 | 53.3 | 0.20 | 6.750 | 97.3 | 93.1 | 95.0 | | 94.1 | | LDMol | 2024-05-28 |
MolReFlect: Towards Fine-grained In-Context Alignment between Molecules and Texts | | 90.3 | 51.0 | | 11.84 | 92.9 | 81.3 | 86.0 | | 97.7 | | MolReFlect | 2024-11-22 |
BioT5+: Towards Generalized Biological Understanding with IUPAC Integration and Multi-task Tuning | ✓ Link | 87.2 | 52.2 | 0.353 | 12.776 | 90.7 | 77.9 | 83.5 | 57.9 | 100 | 252000000 | BioT5+ | 2024-02-27 |
BioT5: Enriching Cross-modal Integration in Biology with Chemical Knowledge and Natural Language Associations | ✓ Link | 86.7 | 41.3 | 0.43 | 15.097 | 88.6 | 73.4 | 80.1 | 57.6 | 100 | 252000000 | BioT5 | 2023-10-11 |
Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective | ✓ Link | 85.7 | 28.0 | 0.41 | 17.14 | 90.3 | 73.9 | 80.5 | 59.3 | 89.9 | | MolReGPT (GPT-4-0413) | 2023-06-11 |
Translation between Molecules and Natural Language | ✓ Link | 85.4 | 30.2 | 1.20 | 16.07 | 83.4 | 68.4 | 74.6 | 55.4 | 90.5 | 770000000 | MolT5-Large | 2022-04-25 |
Unifying Molecular and Textual Representations via Multi-task Language Modelling | ✓ Link | 85.3 | 32.2 | 0.05 | 16.87 | 90.1 | 75.7 | 81.6 | | 94.3 | 220000000 | Text+Chem T5-augm base | 2023-01-29 |
Text-Guided Molecule Generation with Diffusion Language Model | ✓ Link | 82.8 | 24.2 | 0.89 | 16.897 | 87.4 | 72.2 | 77.1 | 58.9 | 78.9 | 180000000 | TGM-DLM w/o corr | 2024-02-20 |
Text-Guided Molecule Generation with Diffusion Language Model | ✓ Link | 82.6 | 24.2 | 0.77 | 17.003 | 85.4 | 68.8 | 73.9 | 58.1 | 87.1 | 180000000 | TGM-DLM | 2024-02-20 |
MolFM: A Multimodal Molecular Foundation Model | ✓ Link | 82.2 | 21.0 | | 19.445 | 85.4 | 75.8 | 69.7 | 58.3 | 89.2 | 296200000 | MolFM-Base | 2023-06-06 |
Unifying Molecular and Textual Representations via Multi-task Language Modelling | ✓ Link | 81.5 | 19.1 | 0.06 | 21.78 | 86.4 | 67.2 | 74.4 | | 95.1 | 60000000 | Text+Chem T5-augm small | 2023-01-29 |
Translation between Molecules and Natural Language | ✓ Link | 81.0 | 31.4 | 0.44 | 16.758 | 87.2 | 72.2 | 78.6 | 59.0 | 99.6 | 770000000 | MolT5-Large-HV | 2022-04-25 |
MolFM: A Multimodal Molecular Foundation Model | ✓ Link | 80.3 | 16.9 | | 20.868 | 83.4 | 72.1 | 66.2 | 57.3 | 85.9 | 13620000 | MolFM-Small | 2023-06-06 |
Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective | ✓ Link | 79.0 | 13.9 | 0.57 | 24.91 | 84.7 | 62.4 | 70.8 | 57.1 | 88.7 | | MolReGPT (GPT-3.5-turbo) | 2023-06-11 |
Translation between Molecules and Natural Language | ✓ Link | 76.9 | 8.1 | 2.18 | 24.458 | 72.1 | 52.9 | 58.8 | 49.6 | 77.2 | 220000000 | MolT5-base | 2022-04-25 |
GIT-Mol: A Multi-modal Large Language Model for Molecular Science with Graph, Image, and Text | ✓ Link | 75.6 | 5.1 | | 26.315 | 73.8 | 51.9 | 58.2 | | 92.8 | | GIT-Mol-caption | 2023-08-14 |
Translation between Molecules and Natural Language | ✓ Link | 75.5 | 7.9 | 2.49 | 25.988 | 70.3 | 51.7 | 56.8 | 48.2 | 72.1 | 60000000 | MolT5-small | 2022-04-25 |
Unifying Molecular and Textual Representations via Multi-task Language Modelling | ✓ Link | 75.0 | 21.2 | 0.061 | 27.39 | 87.4 | 69.7 | 76.7 | | 79.2 | 220000000 | Text+Chem T5 base | 2023-01-29 |
Unifying Molecular and Textual Representations via Multi-task Language Modelling | ✓ Link | 73.9 | 15.7 | 0.066 | 28.54 | 85.9 | 66.0 | 73.6 | | 77.6 | 60000000 | Text+Chem T5 small | 2023-01-29 |
MolXPT: Wrapping Molecules with Text for Generative Pre-training | ✓ Link | | 21.5 | 0.45 | | 85.9 | 66.7 | 75.7 | 57.8 | 98.3 | 350000000 | MolXPT | 2023-05-18 |
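The similarity columns above follow the evaluation protocol popularized by MolT5 on ChEBI-20: generated and ground-truth SMILES are parsed with RDKit, exact match compares canonical SMILES, and the MACCS/RDK/Morgan FTS columns are mean Tanimoto similarities over the corresponding fingerprints. The snippet below is a minimal sketch of that protocol, not code from any of the listed repositories; the function name `evaluate_pairs` and the treatment of invalid generations (scored as zero similarity) are illustrative assumptions.

```python
# Minimal sketch of the MolT5-style evaluation behind the Validity, Exact Match,
# and FTS columns. Requires RDKit; `evaluate_pairs` is an illustrative name,
# not an API from any of the cited codebases.
from rdkit import Chem, DataStructs, RDLogger
from rdkit.Chem import AllChem, MACCSkeys

RDLogger.DisableLog("rdApp.*")  # silence RDKit parse warnings


def evaluate_pairs(pairs):
    """pairs: iterable of (reference_smiles, generated_smiles) strings."""
    n = 0
    valid = exact = 0
    maccs_sum = rdk_sum = morgan_sum = 0.0

    for ref_smi, gen_smi in pairs:
        n += 1
        ref = Chem.MolFromSmiles(ref_smi)
        gen = Chem.MolFromSmiles(gen_smi)
        if ref is None or gen is None:
            # Invalid generations contribute zero here; papers differ on whether
            # they are instead excluded from the FTS averages.
            continue
        valid += 1

        # Exact match: compare canonical SMILES.
        if Chem.MolToSmiles(ref) == Chem.MolToSmiles(gen):
            exact += 1

        # Fingerprint Tanimoto similarities (FTS).
        maccs_sum += DataStructs.TanimotoSimilarity(
            MACCSkeys.GenMACCSKeys(ref), MACCSkeys.GenMACCSKeys(gen))
        rdk_sum += DataStructs.TanimotoSimilarity(
            Chem.RDKFingerprint(ref), Chem.RDKFingerprint(gen))
        morgan_sum += DataStructs.TanimotoSimilarity(
            AllChem.GetMorganFingerprintAsBitVect(ref, 2, nBits=2048),
            AllChem.GetMorganFingerprintAsBitVect(gen, 2, nBits=2048))

    return {
        "Validity": valid / n,
        "Exact Match": exact / n,
        "MACCS FTS": maccs_sum / n,
        "RDK FTS": rdk_sum / n,
        "Morgan FTS": morgan_sum / n,
    }


if __name__ == "__main__":
    # Toy example: one exact hit (same molecule, different SMILES) and one near miss.
    print(evaluate_pairs([
        ("CCO", "OCC"),              # ethanol vs ethanol -> exact match
        ("c1ccccc1O", "c1ccccc1N"),  # phenol vs aniline -> partial similarity
    ]))
```

The remaining columns are computed differently in this protocol: BLEU and Levenshtein operate directly on the SMILES strings, FCD uses the Fréchet ChemNet Distance, and Text2Mol scores similarity with a pretrained text-molecule retrieval model; they are omitted from the sketch for brevity.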