OpenCodePapers

Data-free Knowledge Distillation on SQuAD
Results over time
Leaderboard
| Paper | Code | Exact Match | Model Name | Release Date |
|---|---|---|---|---|
| GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | | 75.2 | GOLD (T5-base) | 2024-03-28 |
| Prompt2Model: Generating Deployable Models from Natural Language Instructions | ✓ | 74.4 | Prompt2Model (T5-base) | 2023-08-23 |
| ZeroGen: Efficient Zero-shot Learning via Dataset Generation | ✓ | 69.4 | ZeroGen (T5-base) | 2022-02-16 |
| ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback | ✓ | 68.1 | ProGen (T5-base) | 2022-10-22 |
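The methods on this leaderboard share a common shape: a large teacher language model synthesizes a SQuAD-style training set from prompts alone (no human-labeled data), a small student such as T5-base is trained on the synthetic set, and the student is scored with SQuAD's Exact Match metric. A minimal sketch of that pipeline is below; all function names are hypothetical stand-ins, the "teacher" is stubbed with a fixed example, and the "student" is a lookup table rather than a fine-tuned T5-base.

```python
# Hypothetical sketch of the data-free distillation loop used by
# ZeroGen/ProGen-style methods. Not any paper's actual API.

def teacher_generate(prompt: str) -> dict:
    # Stand-in for sampling from a large teacher LM: a real system decodes
    # a passage, then a question, then extracts a span answer.
    return {
        "context": "Paris is the capital of France.",
        "question": "What is the capital of France?",
        "answer": "Paris",
    }

def synthesize_dataset(num_examples: int) -> list:
    # Step 1: build a purely synthetic SQuAD-style dataset.
    prompt = "Write a short passage, then a question answerable from it."
    return [teacher_generate(prompt) for _ in range(num_examples)]

def train_student(dataset: list):
    # Step 2: distill into a small student. Here the student merely
    # memorizes the pairs; a real system fine-tunes T5-base with a
    # sequence-to-sequence loss on (question+context -> answer).
    table = {ex["question"]: ex["answer"] for ex in dataset}
    return lambda question: table.get(question, "")

def exact_match(pred: str, gold: str) -> bool:
    # Step 3: SQuAD-style Exact Match (here, case/whitespace-normalized
    # string equality; the official script also strips punctuation/articles).
    return pred.strip().lower() == gold.strip().lower()

synthetic = synthesize_dataset(8)
student = train_student(synthetic)
print(exact_match(student("What is the capital of France?"), "Paris"))  # True
```

The leaderboard's Exact Match column reports this metric on the real SQuAD dev set, so the quality of the synthetic data (the focus of ProGen's feedback loop and GOLD's out-of-distribution guidance) directly bounds student accuracy.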