OpenCodePapers
Relation Extraction on ChemProt
Results over time (interactive chart: metric values per model plotted by release date)
Leaderboard
| Paper | Code | F1 | Micro F1 | Model | Release Date |
|---|---|---|---|---|---|
| SciBERT: A Pretrained Language Model for Scientific Text | ✓ Link | 83.64 | – | SciBert (Finetune) | 2019-03-26 |
| BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA | ✓ Link | 80.0 | – | BioM-BERT | – |
| LinkBERT: Pretraining Language Models with Document Links | ✓ Link | 79.98 | 79.98 | BioLinkBERT (large) | 2022-03-29 |
| SciFive: a text-to-text transformer model for biomedical literature | ✓ Link | 78 | – | SciFive Large | 2021-05-28 |
| Improving Biomedical Pretrained Language Models with Knowledge | ✓ Link | 77.5 | – | KeBioLM | 2021-04-21 |
| SciFive: a text-to-text transformer model for biomedical literature | ✓ Link | 77.40 | – | BioT5X (base) | 2021-05-28 |
| BioMegatron: Larger Biomedical Domain Language Model | ✓ Link | 77.0 | – | BioMegatron | 2020-10-12 |
| BioBERT: a pre-trained biomedical language representation model for biomedical text mining | ✓ Link | 76.46 | – | BioBERT | 2019-01-25 |
| Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets | ✓ Link | 74.4 | – | NCBI_BERT(large) (P) | 2019-06-13 |
| SciBERT: A Pretrained Language Model for Scientific Text | ✓ Link | 73.7 | – | SciBERT (Base Vocab) | 2019-03-26 |
| ELECTRAMed: a new pre-trained language representation model for biomedical NLP | ✓ Link | 72.94 | – | ELECTRAMed | 2021-04-19 |
| Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | ✓ Link | 77.24 | – | PubMedBERT uncased | 2020-07-31 |
| CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters | ✓ Link | 73.44 | – | CharacterBERT (base, medical) | 2020-10-20 |
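
Most of the entries above are domain-pretrained encoders fine-tuned as sentence-level relation classifiers on the ChemProt training split and scored by (micro) F1 on the test set; the SciFive/BioT5X rows use a text-to-text formulation instead. The sketch below illustrates that classification setup with Hugging Face Transformers. It is a minimal sketch, not code from any listed paper: the `allenai/scibert_scivocab_uncased` checkpoint, the five CPR label groups, and the entity-marker format are assumptions made for illustration.

```python
# A minimal sketch, assuming ChemProt is cast as sentence-level relation
# classification with a fine-tuned encoder (as in the SciBERT-style entries above).
# Checkpoint ID, label set, and entity markers are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# ChemProt evaluation typically scores five CPR relation groups (assumed here).
LABELS = ["CPR:3", "CPR:4", "CPR:5", "CPR:6", "CPR:9"]

checkpoint = "allenai/scibert_scivocab_uncased"  # assumed SciBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=len(LABELS)
)

# Chemical and gene/protein mentions are usually wrapped in marker strings
# before classification; "<< >>" / "[[ ]]" is one common convention.
sentence = "<< Aspirin >> irreversibly inhibits [[ cyclooxygenase-1 ]]."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

# The classification head here is freshly initialized; leaderboard numbers come
# from fine-tuning it on the ChemProt training split before evaluating.
with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[logits.argmax(dim=-1).item()])
```

Run as-is, this only downloads the checkpoint and prints an essentially arbitrary label, since the head is untrained; the point is to show the input/output shape of the task rather than to reproduce any score in the table.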