1 code implementation • 30 Mar 2024 • Keller Jordan
CIFAR-10 is among the most widely used datasets in machine learning, facilitating thousands of research projects per year.
no code implementations • 4 Apr 2023 • Keller Jordan
Typical neural network trainings have substantial variance in test-set performance between repeated runs, impeding hyperparameter comparison and training reproducibility.
1 code implementation • 15 Nov 2022 • Keller Jordan, Hanie Sedghi, Olga Saukh, Rahim Entezari, Behnam Neyshabur
In this paper we examine the conjecture of Entezari et al. (2021), which states that once the permutation invariance of neural networks is taken into account, there is likely no loss barrier along the linear interpolation between SGD solutions.
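The quantity at stake in this conjecture can be illustrated with a toy sketch: the "loss barrier" between two solutions is the largest amount by which the loss along the straight line connecting them exceeds the linear interpolation of the endpoint losses (zero means the two solutions are linearly mode-connected). The loss function below is a hypothetical stand-in with two quadratic basins, not anything from the paper, and `barrier` is an illustrative helper name.

```python
import numpy as np

def loss(w):
    # Toy nonconvex loss with two minima, near w = -1 and w = +1 (assumption
    # for illustration; real experiments use trained-network test loss).
    return min(np.sum((w - 1.0) ** 2), np.sum((w + 1.0) ** 2))

def barrier(w_a, w_b, n=101):
    # Max over the interpolation path of L((1-a)w_a + a w_b) minus the
    # linear interpolation of the endpoint losses. Zero barrier means the
    # two solutions sit in one linearly connected low-loss basin.
    alphas = np.linspace(0.0, 1.0, n)
    path = [loss((1 - a) * w_a + a * w_b) for a in alphas]
    ends = [(1 - a) * loss(w_a) + a * loss(w_b) for a in alphas]
    return max(p - e for p, e in zip(path, ends))

w_a = np.full(4, -1.0)  # minimum of one basin
w_b = np.full(4, +1.0)  # minimum of the other basin
print(barrier(w_a, w_b))  # positive: a barrier separates these two minima
print(barrier(w_a, w_a))  # zero: a point is trivially connected to itself
```

The conjecture says that for real SGD solutions, applying the right neuron permutation to one endpoint should drive this barrier to (near) zero.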