| Paper | Code | F1 | Model | Date |
| --- | --- | --- | --- | --- |
| BioBERT: a pre-trained biomedical language representation model for biomedical text mining | ✓ | 89.71 | BioBERT | 2019-01-25 |
| Comparing and combining some popular NER approaches on Biomedical tasks | ✓ | 89.6 | SpanModel + SequenceLabelingModel | 2023-05-30 |
| SciFive: a text-to-text transformer model for biomedical literature | ✓ | 89.39 | SciFive-Base | 2021-05-28 |
| Biomedical Named Entity Recognition at Scale | ✓ | 89.13 | BLSTM-CNN-Char (SparkNLP) | 2020-11-12 |
| Biomedical Named Entity Recognition at Scale | ✓ | 89.13 | Spark NLP | 2020-11-12 |
| Improving Biomedical Pretrained Language Models with Knowledge | ✓ | 89.1 | KeBioLM | 2021-04-21 |
| Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning | ✓ | 88.96 | CL-KL | 2021-05-08 |
| Improving Biomedical Named Entity Recognition with Syntactic Information | ✓ | 88.77 | BioKMNER + BioBERT | 2020-11-25 |
| LinkBERT: Pretraining Language Models with Document Links | ✓ | 88.76 | BioLinkBERT (large) | 2022-03-29 |
| On the Effectiveness of Compact Biomedical Transformers | ✓ | 88.67 | CompactBioBERT | 2022-09-07 |
| Learning A Unified Named Entity Tagger From Multiple Partially Annotated Corpora For Efficient Adaptation | ✓ | 88.6 | STM | 2019-09-25 |
| BERN2: an advanced neural biomedical named entity recognition and normalization tool | ✓ | 88.6 | BERN2 | 2022-01-06 |
| A Neural Named Entity Recognition and Multi-Type Normalization Tool for Biomedical Text Mining | ✓ | 88.3 | BERN | 2019-06-04 |
| On the Effectiveness of Compact Biomedical Transformers | ✓ | 87.93 | DistilBioBERT | 2022-09-07 |
| A Robust and Domain-Adaptive Approach for Low-Resource Named Entity Recognition | ✓ | 87.89 | RDANER | 2021-01-02 |
| Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing | ✓ | 87.82 | PubMedBERT uncased | 2020-07-31 |
| BioMegatron: Larger Biomedical Domain Language Model | ✓ | 87.8 | BioMegatron BERT-cased | 2020-10-12 |
| On the Effectiveness of Compact Biomedical Transformers | ✓ | 87.61 | BioDistilBERT | 2022-09-07 |
| ELECTRAMed: a new pre-trained language representation model for biomedical NLP | ✓ | 87.54 | ELECTRAMed | 2021-04-19 |
| On the Effectiveness of Compact Biomedical Transformers | ✓ | 87.21 | BioMobileBERT | 2022-09-07 |
| SciBERT: A Pretrained Language Model for Scientific Text | ✓ | 86.88 | SciBERT (Base Vocab) | 2019-03-26 |
| GoLLIE: Annotation Guidelines improve Zero-Shot Information-Extraction | ✓ | 86.5 | GoLLIE | 2023-10-05 |
| SciBERT: A Pretrained Language Model for Scientific Text | ✓ | 86.45 | SciBERT (SciVocab) | 2019-03-26 |
| Focusing on Potential Named Entities During Active Label Acquisition | ✓ | 84.5 | BERT-CRF | 2021-11-06 |
| Evaluation of large language model performance on the Biomedical Language Understanding and Reasoning Benchmark | | 65.98 | GPT-4 | 2024-05-17 |
| NuNER: Entity Recognition Encoder Pre-training via LLM-Annotated Data | ✓ | 61.1 | NuNER Zero Span | 2024-02-23 |
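Most of the entries above are encoder models fine-tuned for token classification. The sketch below shows, under stated assumptions, how one of them could be loaded for biomedical NER with Hugging Face `transformers`; the checkpoint id `dmis-lab/biobert-base-cased-v1.1` and the example sentence are illustrative, and the leaderboard scores come from each paper's own fine-tuning setup, not from this snippet.

```python
# Minimal sketch: loading a listed encoder (BioBERT) for token-classification inference.
# Assumption: checkpoint id "dmis-lab/biobert-base-cased-v1.1"; the base checkpoint has no
# NER head, so the classifier layer is randomly initialized and must be fine-tuned on the
# target corpus before its predictions are meaningful.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_name = "dmis-lab/biobert-base-cased-v1.1"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Wrap tokenizer + model in a token-classification pipeline; "simple" aggregation
# merges subword pieces back into word-level entity spans.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
print(ner("Mutations in the BRCA1 gene are associated with hereditary breast cancer."))
```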