Paper | Code | Accuracy | Method | Date |
Label-Retrieval-Augmented Diffusion Models for Learning from Noisy Labels | ✓ Link | 75.7% | LRA-diffusion (CC) | 2023-05-31 |
SST: Self-training with Self-adaptive Thresholding for Semi-supervised Learning | | 75.7% | Super-SST (ViT-Small, 5% Labels) | 2025-05-31 |
Learning with Noisy labels via Self-supervised Adversarial Noisy Masking | ✓ Link | 75.63% | SANM (DivideMix) | 2023-02-14 |
Centrality and Consistency: Two-Stage Clean Samples Identification for Learning with Instance-Dependent Noisy Labels | ✓ Link | 75.4% | CC | 2022-07-29 |
Class Prototype-based Cleaner for Label Noise Learning | ✓ Link | 75.40±0.10% | CPC | 2022-12-21 |
Jigsaw-ViT: Learning Jigsaw Puzzles in Vision Transformer | ✓ Link | 75.4% | Jigsaw-ViT+NCT | 2022-07-25 |
Learning advisor networks for noisy image classification | ✓ Link | 75.35% | MFRW | 2022-11-08 |
Knockoffs-SPR: Clean Sample Selection in Learning with Noisy Labels | ✓ Link | 75.20% | Knockoffs-SPR | 2023-01-02 |
Sample Prior Guided Robust Model Learning to Suppress Noisy Labels | ✓ Link | 75.19% | PGDF | 2021-12-02 |
Augmentation Strategies for Learning with Noisy Labels | ✓ Link | 75.11% | AugDesc | 2021-03-03 |
Compressing Features for Learning with Noisy Labels | ✓ Link | 75% | Nested+Co-teaching (ResNet-50) | 2022-06-27 |
SSR: An Efficient and Robust Framework for Learning with Unknown Label Noise | ✓ Link | 74.91% | SSR | 2021-11-22 |
Boosting Co-teaching with Compression Regularization for Label Noise | ✓ Link | 74.9% | NestedCoTeaching | 2021-04-28 |
Early-Learning Regularization Prevents Memorization of Noisy Labels | ✓ Link | 74.81% | ELR+ | 2020-06-30 |
DivideMix: Learning with Noisy Labels as Semi-supervised Learning | ✓ Link | 74.76% | DivideMix | 2020-02-18 |
Cross-to-merge training with class balance strategy for learning with noisy labels | ✓ Link | 74.61% | C2MT | 2024-04-01 |
Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels | ✓ Link | 74.58±0.15% | ELR+ with C2D (ResNet-50) | 2021-03-25 |
Instance-Dependent Noisy Label Learning via Graphical Modelling | ✓ Link | 74.40% | InstanceGM | 2022-09-02 |
LongReMix: Robust Learning with High Confidence Samples in a Noisy Label Environment | ✓ Link | 74.38% | LongReMix | 2021-03-06 |
FINE Samples for Learning with Noisy Labels | ✓ Link | 74.37% | FINE + DivideMix | 2021-02-23 |
To Smooth or Not? When Label Smoothing Meets Noisy Labels | ✓ Link | 74.24% | Negative Label Smoothing (NLS) | 2021-06-08 |
A Second-Order Approach to Learning with Instance-Dependent Label Noise | ✓ Link | 74.17% | CAL | 2020-12-22 |
NoiseRank: Unsupervised Label Noise Reduction with Dependence Models | | 73.82% | NoiseRank | 2020-03-15 |
Which Strategies Matter for Noisy Label Classification? Insight into Loss and Uncertainty | | 73.8% | FOCI | 2020-08-14 |
Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting | ✓ Link | 73.72% | MW-Net | 2019-02-20 |
Probabilistic End-to-end Noise Correction for Learning with Noisy Labels | ✓ Link | 73.49% | PENCIL | 2019-03-19 |
Learning to Learn from Noisy Labeled Data | ✓ Link | 73.47% | MLNT | 2018-12-13 |
Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | ✓ Link | 73.39% | HOC | 2021-02-10 |
Contrastive Learning Improves Model Robustness Under Label Noise | ✓ Link | 73.36% | MAE (SimCLR) | 2021-04-19 |
Contrastive Learning Improves Model Robustness Under Label Noise | ✓ Link | 73.35% | Generalized CE (SimCLR) | 2021-04-19 |
Derivative Manipulation for General Example Weighting | ✓ Link | 73.3% | DM | 2019-05-27 |
Contrastive Learning Improves Model Robustness Under Label Noise | ✓ Link | 73.27% | CCE (SimCLR) | 2021-04-19 |
Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | ✓ Link | 73.24% | CORES2 | 2020-10-05 |
IMAE for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude's Variance Matters | ✓ Link | 73.2% | IMAE | 2019-03-28 |
When Optimizing $f$-divergence is Robust with Label Noise | ✓ Link | 73.09% | Robust f-divergence | 2020-11-07 |
Safeguarded Dynamic Label Regression for Generalized Noisy Supervision | ✓ Link | 73.07% | LCCN | 2019-03-06 |
L_DMI: An Information-theoretic Noise-robust Loss Function | ✓ Link | 72.46% | DMI | 2019-09-08 |
Adaptive Sample Selection for Robust Learning under Label Noise | ✓ Link | 72.28% | BARE | 2021-06-29 |
Joint Optimization Framework for Learning with Noisy Labels | ✓ Link | 72.23% | Joint Opt. | 2018-03-30 |
Error-Bounded Correction of Noisy Labels | ✓ Link | 71.74% | LRT | 2020-11-19 |
Scalable Penalized Regression for Noise Detection in Learning with Noisy Labels | ✓ Link | 71.16% | SPR | 2022-03-15 |
Masking: A New Perspective of Noisy Supervision | ✓ Link | 71.1% | MASKING | 2018-05-21 |
Symmetric Cross Entropy for Robust Learning with Noisy Labels | ✓ Link | 71.02% | SCE | 2019-08-16 |
Unsupervised Label Noise Modeling and Loss Correction | ✓ Link | 71% | DY | 2019-04-25 |
Beyond Class-Conditional Assumption: A Primary Attempt to Combat Instance-Dependent Label Noise | ✓ Link | 70.63% | SEAL | 2020-12-10 |
Combating noisy labels by agreement: A joint training method with co-regularization | ✓ Link | 70.3% | JoCoR | 2020-03-05 |
Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | ✓ Link | 70.15% | CoT | 2018-04-18 |
Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels | ✓ Link | 69.75% | GCE | 2018-05-20 |
Dimensionality-Driven Learning with Noisy Labels | ✓ Link | 69.47% | D2L | 2018-06-07 |
Adaptive Sample Selection for Robust Learning under Label Noise | ✓ Link | 68.94% | CCE | 2021-06-29 |
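
Several of the loss-based entries above (GCE, SCE, IMAE, DMI) replace the standard cross-entropy with a noise-tolerant objective. As a concrete illustration, below is a minimal PyTorch-style sketch of the Generalized Cross Entropy loss corresponding to the GCE row; the framework choice, function name, and usage values are illustrative assumptions and are not drawn from any of the listed implementations.

```python
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, targets, q=0.7):
    """Generalized Cross Entropy: L_q = (1 - p_y^q) / q.

    Interpolates between standard cross-entropy (q -> 0) and the
    noise-robust mean absolute error (q = 1); q=0.7 is the paper's default.
    """
    probs = F.softmax(logits, dim=1)                         # (batch, num_classes)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)   # prob. assigned to the (possibly noisy) label
    return ((1.0 - p_y.clamp_min(1e-7) ** q) / q).mean()

# Illustrative usage on random data (shapes and class count are arbitrary):
logits = torch.randn(8, 14, requires_grad=True)
targets = torch.randint(0, 14, (8,))
loss = generalized_cross_entropy(logits, targets)
loss.backward()
```

The sample-selection and semi-supervised entries in the table (e.g. DivideMix, ELR+, Co-teaching) typically combine a robust objective like this with mechanisms for identifying or down-weighting suspect labels, which is where most of the remaining accuracy gap in the leaderboard comes from.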