OpenCodePapers

Relation Extraction on ChemProt

Task: Relation Extraction
Dataset: ChemProt
Results over time
Leaderboard
| Paper | Code | F1 | Micro F1 | Model Name | Release Date |
|---|---|---|---|---|---|
| SciBERT: A Pretrained Language Model for Scientific Text | ✓ | 83.64 | | SciBert (Finetune) | 2019-03-26 |
| BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA | ✓ | 80.0 | | BioM-BERT | |
| LinkBERT: Pretraining Language Models with Document Links | ✓ | 79.98 | 79.98 | BioLinkBERT (large) | 2022-03-29 |
| SciFive: a text-to-text transformer model for biomedical literature | ✓ | 78 | | SciFive Large | 2021-05-28 |
| Improving Biomedical Pretrained Language Models with Knowledge | ✓ | 77.5 | | KeBioLM | 2021-04-21 |
| SciFive: a text-to-text transformer model for biomedical literature | ✓ | 77.40 | | BioT5X (base) | 2021-05-28 |
| BioMegatron: Larger Biomedical Domain Language Model | ✓ | 77.0 | | BioMegatron | 2020-10-12 |
| BioBERT: a pre-trained biomedical language representation model for biomedical text mining | ✓ | 76.46 | | BioBERT | 2019-01-25 |
| Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets | ✓ | 74.4 | | NCBI_BERT(large) (P) | 2019-06-13 |
| SciBERT: A Pretrained Language Model for Scientific Text | ✓ | 73.7 | | SciBERT (Base Vocab) | 2019-03-26 |
| ELECTRAMed: a new pre-trained language representation model for biomedical NLP | ✓ | 72.94 | | ELECTRAMed | 2021-04-19 |
| Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | ✓ | | 77.24 | PubMedBERT uncased | 2020-07-31 |
| CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | ✓ | | 73.44 | CharacterBERT (base, medical) | 2020-10-20 |
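
Most entries above fine-tune a pretrained biomedical or scientific encoder on ChemProt sentence-level relation classification and report (micro) F1 over the evaluated CPR relation classes. The sketch below shows one common way such a fine-tuning run could be set up with Hugging Face Transformers; it is a minimal illustration, not the pipeline of any listed paper, and the checkpoint name, file names, label set, and hyperparameters are assumptions.

```python
# Minimal sketch (assumptions, not any paper's official code): fine-tune a
# pretrained encoder on ChemProt-style sentence-level relation classification.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL = "allenai/scibert_scivocab_uncased"  # any encoder from the table could be swapped in
LABELS = ["CPR:3", "CPR:4", "CPR:5", "CPR:6", "CPR:9", "false"]  # assumed label set

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=len(LABELS))

# Assumed local CSVs with columns "text" (sentence with entity markers) and "label".
data = load_dataset("csv", data_files={"train": "chemprot_train.csv",
                                       "validation": "chemprot_dev.csv"})

def preprocess(batch):
    # Tokenize the marked sentence and map string labels to class indices.
    enc = tokenizer(batch["text"], truncation=True, max_length=256)
    enc["labels"] = [LABELS.index(lab) for lab in batch["label"]]
    return enc

data = data.map(preprocess, batched=True, remove_columns=data["train"].column_names)

def compute_metrics(eval_pred):
    logits, gold = eval_pred
    preds = np.argmax(logits, axis=-1)
    # ChemProt is typically scored as micro F1 over the five CPR classes,
    # ignoring the "false" (no-relation) label.
    return {"micro_f1": f1_score(gold, preds, labels=list(range(5)), average="micro")}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="chemprot-re",
                           num_train_epochs=3,
                           per_device_train_batch_size=16,
                           learning_rate=2e-5),
    train_dataset=data["train"],
    eval_dataset=data["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())
```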