Robust Temporal Ensembling for Learning with Noisy Labels

29 Sep 2021 · Abel Brown, Benedikt Schifferer, Robert DiPietro

Successful training of deep neural networks with noisy labels is an essential capability as most real-world datasets contain some amount of mislabeled data. Left unmitigated, label noise can sharply degrade typical supervised learning approaches. In this paper, we present robust temporal ensembling (RTE), which combines robust loss with semi-supervised regularization methods to achieve noise-robust learning. We demonstrate that RTE achieves state-of-the-art performance across the CIFAR-10, CIFAR-100, ImageNet, WebVision, and Food-101N datasets, while forgoing the recent trend of label filtering and/or fixing. Finally, we show that RTE also retains competitive corruption robustness to unforeseen input noise using CIFAR-10-C, obtaining a mean corruption error (mCE) of 13.50% even in the presence of an 80% noise ratio, versus 26.9% mCE with standard methods on clean data.
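The abstract describes RTE as a combination of a robust loss with semi-supervised regularization. As a hedged illustration only (the paper's exact losses and hyperparameters are not given here), the sketch below pairs a common robust loss, generalized cross-entropy, with the exponential-moving-average prediction targets used in temporal ensembling; the names `gce_loss`, `TemporalEnsemble`, and `consistency_loss`, and all hyperparameter values, are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gce_loss(probs, labels, q=0.7):
    """Generalized cross-entropy (a common noise-robust loss).
    Interpolates between MAE (q=1) and cross-entropy (q->0).
    NOTE: stand-in for the paper's robust loss, which is not specified here."""
    p_y = probs[np.arange(len(labels)), labels]  # probability of the labeled class
    return np.mean((1.0 - p_y ** q) / q)

class TemporalEnsemble:
    """Exponential moving average of per-sample predictions, in the spirit of
    temporal ensembling. alpha is an illustrative decay rate, not from the paper."""
    def __init__(self, n_samples, n_classes, alpha=0.6):
        self.alpha = alpha
        self.Z = np.zeros((n_samples, n_classes))  # EMA accumulator per sample
        self.t = 0                                 # number of updates so far

    def update(self, idx, probs):
        """Fold the current epoch's predictions into the EMA and return
        bias-corrected ensemble targets for the given sample indices."""
        self.t += 1
        self.Z[idx] = self.alpha * self.Z[idx] + (1 - self.alpha) * probs
        # bias correction so early-epoch targets are not shrunk toward zero
        return self.Z[idx] / (1 - self.alpha ** self.t)

def consistency_loss(probs, targets):
    """Semi-supervised regularizer: penalize disagreement between the
    network's current predictions and the ensembled targets."""
    return np.mean((probs - targets) ** 2)
```

A training step would then minimize `gce_loss(probs, noisy_labels) + w * consistency_loss(probs, ensemble.update(idx, probs))`, where the consistency weight `w` is typically ramped up over the first epochs.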


Results from the Paper


 Ranked #1 on Image Classification on mini WebVision 1.0 (ImageNet Top-5 Accuracy metric)

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Image Classification | mini WebVision 1.0 | RTE (Inception-ResNet-v2) | ImageNet Top-1 Accuracy | 80.84 | #2 |
| Image Classification | mini WebVision 1.0 | RTE (Inception-ResNet-v2) | ImageNet Top-5 Accuracy | 97.24 | #1 |

Methods


No methods listed for this paper.