OpenCodePapers

Knowledge Distillation on COCO 2017 val
Results over time (interactive chart omitted)
Leaderboard
| Paper | Code | AP@0.5 | AP@0.75 | mAP | Model | Release Date |
|---|---|---|---|---|---|---|
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ Link | 61.80 | 44.94 | 41.03 | ReviewKD++ (T: Faster R-CNN (ResNet-101), S: Faster R-CNN (ResNet-50)) | 2023-05-26 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ Link | 57.96 | 40.15 | 37.43 | ReviewKD++ (T: Faster R-CNN (ResNet-101), S: Faster R-CNN (ResNet-18)) | 2023-05-26 |
| Improving Knowledge Distillation via Regularizing Feature Norm and Direction | ✓ Link | 55.18 | 37.21 | 34.51 | ReviewKD++ (T: Faster R-CNN (ResNet-101), S: Faster R-CNN (MobileNetV2)) | 2023-05-26 |
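All listed entries come from ReviewKD++, whose paper regularizes the student's feature norm and direction toward the teacher's. The sketch below is a minimal, hypothetical illustration of that idea (a direction term via cosine similarity plus a norm-matching term), not the authors' actual implementation; the function name and weights are assumptions for illustration.

```python
import numpy as np

def norm_direction_kd_loss(student_feat, teacher_feat, w_norm=1.0, w_dir=1.0):
    """Hypothetical sketch of a norm-and-direction distillation loss.

    student_feat, teacher_feat: (batch, dim) feature arrays from the
    student and teacher networks at matched layers.
    """
    s_norm = np.linalg.norm(student_feat, axis=1)
    t_norm = np.linalg.norm(teacher_feat, axis=1)
    # Direction term: penalize angular mismatch (1 - cosine similarity).
    cos = np.sum(student_feat * teacher_feat, axis=1) / (s_norm * t_norm + 1e-8)
    dir_loss = np.mean(1.0 - cos)
    # Norm term: encourage the student's feature magnitude to match the teacher's.
    norm_loss = np.mean((s_norm - t_norm) ** 2)
    return w_dir * dir_loss + w_norm * norm_loss
```

With identical features the loss is (numerically) zero; scaling the student's features leaves the direction term near zero but grows the norm term, so the two terms capture complementary mismatches.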