OpenCodePapers

Quantization on ImageNet

Task: Quantization
Dataset: ImageNet

Results over time
Leaderboard
Paper | Code | Top-1 Accuracy (%) | Weight bits | Activation bits | Model name | Release date
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 85.03 | 8 | 8 | FQ-ViT (ViT-L) | 2021-11-27
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 83.31 | 8 | 8 | FQ-ViT (ViT-B) | 2021-11-27
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 82.97 | 8 | 8 | FQ-ViT (Swin-B) | 2021-11-27
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 82.71 | 8 | 8 | FQ-ViT (Swin-S) | 2021-11-27
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 81.20 | 8 | 8 | FQ-ViT (DeiT-B) | 2021-11-27
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 80.51 | 8 | 8 | FQ-ViT (Swin-T) | 2021-11-27
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 79.17 | 8 | 8 | FQ-ViT (DeiT-S) | 2021-11-27
HPTQ: Hardware-Friendly Post Training Quantization | ✓ | 78.972 | 8 | 8 | Xception W8A8 | 2021-09-19
Learned Step Size Quantization | ✓ | 77.878 | 4 | 4 | ADLIK-MO-ResNet50-W4A4 | 2019-02-21
Learned Step Size Quantization | ✓ | 77.34 | 3 | 4 | ADLIK-MO-ResNet50-W3A4 | 2019-02-21
HPTQ: Hardware-Friendly Post Training Quantization | ✓ | 77.092 | 8 | 8 | EfficientNet-B0 ReLU W8A8 | 2021-09-19
Learned Step Size Quantization | ✓ | 76.7 | 4 | 4 | ResNet50-W4A4 (paper) | 2019-02-21
HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs | ✓ | 76.4 | 8 | 8 | EfficientNet-B0-W8A8 | 2020-07-20
HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs | ✓ | 76 | 4 | 4 | EfficientNet-B0-W4A4 | 2020-07-20
HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs | ✓ | 75.45 | 3 | 4 | ResNet50-W3A4 | 2020-07-20
HPTQ: Hardware-Friendly Post Training Quantization | ✓ | 74.216 | 8 | 8 | EfficientNet-B0 W8A8 | 2021-09-19
Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural Networks by Pruning A Randomly Weighted Network | ✓ | 74.03 | | | MPT (80) +BN | 2021-03-17
LSQ+: Improving low-bit quantization through learnable offsets and better initialization | ✓ | 73.8 | 4 | 4 | EfficientNet-W4A4 | 2020-04-20
HPTQ: Hardware-Friendly Post Training Quantization | ✓ | 73.356 | 8 | 8 | DenseNet-121 W8A8 | 2021-09-19
LSQ+: Improving low-bit quantization through learnable offsets and better initialization | ✓ | 71.7 | 4 | 4 | MixNet-W4A4 | 2020-04-20
FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer | ✓ | 71.61 | 8 | 8 | FQ-ViT (DeiT-T) | 2021-11-27
Training Multi-bit Quantized and Binarized Networks with A Learnable Symmetric Quantizer | ✓ | 71.5 | | | UniQ (Ours) | 2021-04-01
HPTQ: Hardware-Friendly Post Training Quantization | ✓ | 71.46 | 8 | 8 | MobileNetV2 W8A8 | 2021-09-19
HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs | ✓ | 70.9 | | | MobileNetV2 | 2020-07-20
R2 Loss: Range Restriction Loss for Model Compression and Quantization | | 69.79 | 4 | | MobileNet-v1 + EWGS + R2Loss | 2023-03-14
R2 Loss: Range Restriction Loss for Model Compression and Quantization | | 69.6 | 4 | | MobileNet-v1 + LSQ + R2Loss | 2023-03-14
R2 Loss: Range Restriction Loss for Model Compression and Quantization | | 68.45 | 2 | 4 | ResNet-18 + PACT + R2Loss | 2023-03-14
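
In the bit columns and model names, the WxAy shorthand means weights are quantized to x bits and activations to y bits (e.g. W8A8, W4A4). As a rough illustration only, not code from any of the listed papers, the sketch below fake-quantizes a tensor onto a symmetric uniform b-bit grid using a naive max-abs step size; the fake_quantize helper and its step-size choice are assumptions for illustration, whereas methods such as Learned Step Size Quantization treat the step size as a trainable parameter.

import numpy as np

def fake_quantize(x: np.ndarray, num_bits: int) -> np.ndarray:
    # Symmetric uniform quantize-dequantize (illustrative sketch).
    # Max-abs scaling is a simplifying assumption; LSQ-style methods learn the step size.
    qmax = 2 ** (num_bits - 1) - 1            # 127 for 8 bits, 7 for 4 bits
    step = np.max(np.abs(x)) / qmax           # step size of the uniform grid
    q = np.clip(np.round(x / step), -qmax - 1, qmax)
    return q * step                           # dequantized values used downstream

weights = np.random.randn(64, 64).astype(np.float32)
w8 = fake_quantize(weights, 8)                # the "W8" in W8A8
w4 = fake_quantize(weights, 4)                # the "W4" in W4A4
print(np.abs(weights - w8).max(), np.abs(weights - w4).max())

The same quantize-dequantize step applied to layer inputs gives the "A" (activation) side; the larger reconstruction error at 4 bits is why the W4A4 entries above generally trail their W8A8 counterparts.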