OpenCodePapers

Knowledge Distillation on CIFAR-100
Leaderboard
| Paper | Code | Top-1 Accuracy (%) | Model Name | Release Date |
|---|---|---|---|---|
| Understanding the Role of the Projector in Knowledge Distillation | ✓ | 79.86 | SRD (T: resnet-32x4, S: shufflenet-v2) | 2023-03-20 |
| Logit Standardization in Knowledge Distillation | ✓ | 78.76 | shufflenet-v2 (T: resnet-32x4, S: shufflenet-v2) | 2024-03-03 |
| MV-MR: multi-views and multi-representations for self-supervised learning and knowledge distillation | ✓ | 78.6 | MV-MR (T: CLIP/ViT-B-16, S: resnet50) | 2023-03-21 |
| Logit Standardization in Knowledge Distillation | ✓ | 78.28 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2024-03-03 |
| Knowledge Distillation with the Reused Teacher Classifier | ✓ | 78.08 | resnet8x4 (T: resnet32x4, S: resnet8x4 [modified]) | 2022-03-26 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ | 77.93 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v2) | 2023-05-26 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ | 77.68 | ReviewKD++ (T: resnet-32x4, S: shufflenet-v1) | 2023-05-26 |
| LumiNet: The Bright Side of Perceptual Knowledge Distillation | ✓ | 77.50 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2023-10-05 |
| Information Theoretic Representation Distillation | ✓ | 76.68 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2021-12-01 |
| Knowledge Distillation from A Stronger Teacher | ✓ | 76.31 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2022-05-21 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ | 76.28 | DKD++ (T: resnet-32x4, S: resnet-8x4) | 2023-05-26 |
| Wasserstein Contrastive Representation Distillation | | 76.15 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2020-12-15 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ | 75.66 | ReviewKD++ (T: WRN-40-2, S: WRN-40-1) | 2023-05-26 |
| Distilling Knowledge via Knowledge Review | ✓ | 75.63 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2021-04-19 |
| Contrastive Representation Distillation | ✓ | 75.51 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2019-10-23 |
| Information Theoretic Representation Distillation | ✓ | 74.93 | vgg8 (T: vgg13, S: vgg8) | 2021-12-01 |
| Distilling Knowledge via Knowledge Review | ✓ | 74.84 | vgg8 (T: vgg13, S: vgg8) | 2021-04-19 |
| Wasserstein Contrastive Representation Distillation | | 74.72 | vgg8 (T: vgg13, S: vgg8) | 2020-12-15 |
| Contrastive Representation Distillation | ✓ | 74.29 | vgg8 (T: vgg13, S: vgg8) | 2019-10-23 |
| Distilling the Knowledge in a Neural Network | ✓ | 73.33 | resnet8x4 (T: resnet32x4, S: resnet8x4) | 2015-03-09 |
| Distilling the Knowledge in a Neural Network | ✓ | 72.98 | vgg8 (T: vgg13, S: vgg8) | 2015-03-09 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ | 72.53 | KD++ (T: resnet56, S: resnet20) | 2023-05-26 |
| Information Theoretic Representation Distillation | ✓ | 71.99 | resnet20 (T: resnet110, S: resnet20) | 2021-12-01 |
| Wasserstein Contrastive Representation Distillation | | 71.88 | resnet20 (T: resnet110, S: resnet20) | 2020-12-15 |
| Contrastive Representation Distillation | ✓ | 71.56 | resnet20 (T: resnet110, S: resnet20) | 2019-10-23 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ | 70.82 | DKD++ (T: resnet50, S: mobilenetv2) | 2023-05-26 |
| Distilling the Knowledge in a Neural Network | ✓ | 70.67 | resnet20 (T: resnet110, S: resnet20) | 2015-03-09 |

A ✓ in the Code column indicates that an official code release is linked on the paper's entry page. Model names follow the convention student (T: teacher, S: student).
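For context, the "Distilling the Knowledge in a Neural Network" rows above are the classic KD baseline: the student is trained on a weighted sum of cross-entropy against the labels and a temperature-softened KL divergence toward the teacher's outputs. Below is a minimal NumPy sketch of that objective; the temperature `T=4.0` and weight `alpha=0.9` are illustrative defaults, not the settings used by any entry in the table.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis, numerically stabilized."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic KD objective (Hinton et al., 2015), sketched for clarity:
    alpha * T^2 * KL(teacher || student)  +  (1 - alpha) * CE(labels, student).
    The T^2 factor keeps the soft-target gradient scale comparable across temperatures.
    """
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL divergence between softened teacher and student distributions, per sample.
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Standard cross-entropy against the hard labels (temperature 1).
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

When the student's logits match the teacher's, the KL term vanishes and only the (down-weighted) cross-entropy remains; a student that disagrees with the teacher pays a strictly larger loss.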