Exponential Moving Average Normalization for Self-supervised and Semi-supervised Learning

We present a plug-in replacement for batch normalization (BN) called exponential moving average normalization (EMAN), which improves the performance of existing student-teacher based self- and semi-supervised learning techniques. Unlike the standard BN, where the statistics are computed within each batch, EMAN, used in the teacher, updates its statistics by exponential moving average from the BN statistics of the student. This design reduces the intrinsic cross-sample dependency of BN and enhances the generalization of the teacher. EMAN improves strong baselines for self-supervised learning by 4-6/1-2 points and semi-supervised learning by about 7/2 points, when 1%/10% supervised labels are available on ImageNet. These improvements are consistent across methods, network architectures, training duration, and datasets, demonstrating the general effectiveness of this technique. The code is available at https://github.com/amazon-research/exponential-moving-average-normalization.

CVPR 2021
Task                                  | Dataset                       | Model                        | Metric         | Value | Global Rank
Semi-Supervised Image Classification | ImageNet - 0.2% labeled data  | FixMatch w/ EMAN (ResNet-50) | Top-1 Accuracy | 43.6% | #2
Semi-Supervised Image Classification | ImageNet - 10% labeled data   | FixMatch-EMAN                | Top-1 Accuracy | 74%   | #24
Semi-Supervised Image Classification | ImageNet - 1% labeled data    | FixMatch-EMAN                | Top-1 Accuracy | 63%   | #26

Methods
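
EMAN can be added on top of a standard mean-teacher setup in a few lines: instead of exponentially averaging only the learnable weights into the teacher, the student's BN running statistics (running_mean / running_var) are averaged as well, and the teacher is kept in eval mode so its BN layers normalize with these EMA statistics rather than computing per-batch statistics. The PyTorch sketch below illustrates the idea; the function name eman_update and the momentum value 0.999 are illustrative choices for this sketch, not the repository's exact API.

```python
import copy

import torch
import torchvision


@torch.no_grad()
def eman_update(student: torch.nn.Module, teacher: torch.nn.Module,
                momentum: float = 0.999) -> None:
    """One EMAN step: exponentially average the student into the teacher.

    Unlike a plain mean teacher, this averages the BN buffers
    (running_mean / running_var) in addition to the learnable weights,
    so the teacher's normalization statistics track the student's
    instead of being computed within each batch.
    """
    for p_s, p_t in zip(student.parameters(), teacher.parameters()):
        p_t.mul_(momentum).add_(p_s, alpha=1.0 - momentum)
    for b_s, b_t in zip(student.buffers(), teacher.buffers()):
        if b_t.dtype.is_floating_point:  # BN running_mean / running_var
            b_t.mul_(momentum).add_(b_s, alpha=1.0 - momentum)
        else:  # integer buffers such as num_batches_tracked
            b_t.copy_(b_s)


# Setup: the teacher is a frozen copy of the student, kept in eval mode
# so its BN layers always use the EMA statistics updated above.
student = torchvision.models.resnet50()
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)
teacher.eval()

# In the training loop, call eman_update(student, teacher) after each
# optimizer step on the student.
```

Keeping the teacher permanently in eval mode is what removes the cross-batch dependency the abstract refers to: the teacher's outputs for a given sample no longer depend on which other samples happen to share its batch.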