Paper | Code | MAP | MRR | Model Name | Release Date |
---|---|---|---|---|---|
Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection | | 0.743 | 0.800 | DeBERTa-V3-Large + SSP | 2022-05-20 |
Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection | | 0.697 | 0.757 | ELECTRA-Base + SSP | 2022-05-20 |
Paragraph-based Transformer Pre-training for Multi-Sentence Inference | ✓ Link | 0.673 | 0.737 | RoBERTa-Base Joint MSPP | 2022-05-02 |
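
The MAP and MRR columns are the standard mean average precision and mean reciprocal rank, computed over each question's ranked list of candidate answer sentences. For reference, here is a minimal sketch of how both are computed; the data and function names are illustrative, not tied to any particular dataset or evaluation script:

```python
from typing import List, Tuple

def average_precision(labels: List[int]) -> float:
    """AP for one question: `labels` are binary relevance judgments
    in ranked order (1 = correct answer sentence, 0 = incorrect)."""
    hits, precisions = 0, []
    for rank, label in enumerate(labels, start=1):
        if label:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / hits if hits else 0.0

def reciprocal_rank(labels: List[int]) -> float:
    """Reciprocal rank of the first correct answer; 0 if none is found."""
    for rank, label in enumerate(labels, start=1):
        if label:
            return 1.0 / rank
    return 0.0

def map_mrr(per_question_labels: List[List[int]]) -> Tuple[float, float]:
    """Mean of AP and RR over all questions -- the MAP / MRR columns above."""
    n = len(per_question_labels)
    map_score = sum(average_precision(l) for l in per_question_labels) / n
    mrr_score = sum(reciprocal_rank(l) for l in per_question_labels) / n
    return map_score, mrr_score

# Toy example: two questions with ranked candidate labels
print(map_mrr([[0, 1, 0, 1], [1, 0, 0]]))  # -> (0.75, 0.75)
```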