OpenCodePapers

Semantic Textual Similarity on STS15

Task: Semantic Textual Similarity
Dataset: Link
Results over time: chart of each model's Spearman correlation plotted by release date.
Leaderboard
| Paper | Code | Spearman Correlation | Model Name | Release Date |
|---|---|---|---|---|
| Scaling Sentence Embeddings with Large Language Models | ✓ Link | 0.9004 | PromptEOL+CSE+LLaMA-30B | 2023-07-31 |
| AnglE-optimized Text Embeddings | ✓ Link | 0.8956 | AnglE-LLaMA-13B | 2023-09-22 |
| Scaling Sentence Embeddings with Large Language Models | ✓ Link | 0.8952 | PromptEOL+CSE+OPT-13B | 2023-07-31 |
| Scaling Sentence Embeddings with Large Language Models | ✓ Link | 0.8951 | PromptEOL+CSE+OPT-2.7B | 2023-07-31 |
| AnglE-optimized Text Embeddings | ✓ Link | 0.8943 | AnglE-LLaMA-7B-v2 | 2023-09-22 |
| Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | ✓ Link | 0.8863 | Trans-Encoder-RoBERTa-large-cross (unsup.) | 2021-09-27 |
| Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | ✓ Link | 0.8816 | Trans-Encoder-BERT-large-bi (unsup.) | 2021-09-27 |
| Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning | ✓ Link | 0.8808 | PromCSE-RoBERTa-large (0.355B) | 2022-03-14 |
| SimCSE: Simple Contrastive Learning of Sentence Embeddings | ✓ Link | 0.8666 | SimCSE-RoBERTa-large | 2021-04-18 |
| Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | ✓ Link | 0.8577 | Trans-Encoder-RoBERTa-base-cross (unsup.) | 2021-09-27 |
| Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | ✓ Link | 0.8508 | Trans-Encoder-BERT-base-bi (unsup.) | 2021-09-27 |
| Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | ✓ Link | 0.8444 | Trans-Encoder-BERT-base-cross (unsup.) | 2021-09-27 |
| DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings | ✓ Link | 0.8390 | DiffCSE-BERT-base | 2022-04-21 |
| DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings | ✓ Link | 0.8281 | DiffCSE-RoBERTa-base | 2022-04-21 |
| Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | ✓ Link | 0.8185 | SRoBERTa-NLI-large | 2019-08-27 |
| Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders | ✓ Link | 0.814 | Mirror-BERT-base (unsup.) | 2021-04-16 |
| Generating Datasets with Pretrained Language Models | ✓ Link | 0.8049 | Dino (STSb) | 2021-04-15 |
| Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders | ✓ Link | 0.798 | Mirror-RoBERTa-base (unsup.) | 2021-04-16 |
| An Unsupervised Sentence Embedding Method by Mutual Information Maximization | ✓ Link | 0.7523 | IS-BERT-NLI | 2020-09-25 |
| On the Sentence Embeddings from Pre-trained Language Models | ✓ Link | 0.7492 | BERT-large-flow (target) | 2020-11-02 |
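
For reference, STS benchmarks such as STS15 score a model by the Spearman rank correlation between its predicted sentence-pair similarities (typically the cosine similarity of the two sentence embeddings) and the human-annotated gold scores; the values above are those correlations on a 0-1 scale. Below is a minimal sketch of that evaluation, assuming a hypothetical `encode` function standing in for any of the leaderboard's sentence encoders (SimCSE, AnglE, PromptEOL, etc.); the placeholder returns random vectors only so the script runs end to end.

```python
# Sketch of STS-style evaluation: Spearman correlation between cosine
# similarities of sentence embeddings and human gold similarity scores.
import numpy as np
from scipy.stats import spearmanr


def encode(sentences):
    """Placeholder encoder: swap in a real sentence-embedding model."""
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(sentences), 768))


def evaluate_sts(pairs, gold_scores):
    """pairs: list of (sentence1, sentence2); gold_scores: human ratings (e.g. 0-5)."""
    emb1 = encode([a for a, _ in pairs])
    emb2 = encode([b for _, b in pairs])
    # Cosine similarity for each sentence pair.
    sims = np.sum(emb1 * emb2, axis=1) / (
        np.linalg.norm(emb1, axis=1) * np.linalg.norm(emb2, axis=1)
    )
    # Rank correlation between predicted similarities and gold scores.
    return spearmanr(sims, gold_scores).correlation


if __name__ == "__main__":
    pairs = [
        ("A man is playing a guitar.", "A person plays guitar."),
        ("A dog runs in the grass.", "Stocks fell sharply on Monday."),
        ("Two kids are playing soccer.", "Children play football outside."),
    ]
    gold = [4.8, 0.2, 4.5]
    print("Spearman correlation:", evaluate_sts(pairs, gold))
```

With the random placeholder encoder the printed correlation is meaningless; plugging in one of the tabled models' encoders over the full STS15 pair set yields the kind of score reported in the leaderboard (papers often report it multiplied by 100).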