1 code implementation • 21 Nov 2022 • Timo Kaiser, Lukas Ehmann, Christoph Reinders, Bodo Rosenhahn
We introduce Blind Knowledge Distillation, a novel teacher-student approach for learning with noisy labels. It masks the ground-truth-related teacher output to filter out potentially corrupted knowledge and to estimate the tipping point from generalization to overfitting.
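The core idea of masking the ground-truth-related teacher output can be illustrated with a minimal sketch. This is one plausible reading, not the paper's exact method: we assume "masking" means zeroing the teacher's probability for the ground-truth class and renormalizing over the remaining classes, so the student is distilled only from knowledge that is blind to the (possibly noisy) label. The function name `blind_teacher_targets` is our own.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def blind_teacher_targets(teacher_logits, labels):
    """Zero out the ground-truth class in the teacher's soft output
    and renormalize over the remaining classes (a "blind" target).

    Assumption: this is a sketch of the masking idea, not the
    authors' exact formulation.
    """
    probs = softmax(teacher_logits)
    masked = probs.copy()
    rows = np.arange(len(labels))
    masked[rows, labels] = 0.0                     # hide label-related knowledge
    masked /= masked.sum(axis=-1, keepdims=True)   # renormalize to a distribution
    return masked
```

A student could then be trained against these blind targets (e.g. with a cross-entropy or KL term), so that a sample whose label the teacher cannot reproduce from the remaining classes is flagged as potentially corrupted.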