Search Results for author: Eren C. Kızıldağ

Found 4 papers, 0 papers with code

Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks

no code implementations · 2 Mar 2021 · David Gamarnik, Eren C. Kızıldağ, Ilias Zadik

Using a simple covering number argument, we establish that, under quite mild distributional assumptions on the input/label pairs, any such network achieving a small training error on polynomially many data points necessarily has a well-controlled outer norm.

Generalization Bounds

Neural Networks and Polynomial Regression. Demystifying the Overparametrization Phenomena

no code implementations · 23 Mar 2020 · Matt Emschwiller, David Gamarnik, Eren C. Kızıldağ, Ilias Zadik

Thus a message implied by our results is that parametrizing wide neural networks by the number of hidden nodes is misleading; a more fitting measure of parametrization complexity is the number of regression coefficients associated with the tensorized data.

Generalization Bounds · Regression

Stationary Points of Shallow Neural Networks with Quadratic Activation Function

no code implementations · 3 Dec 2019 · David Gamarnik, Eren C. Kızıldağ, Ilias Zadik

Next, we show that initializing below this barrier is in fact easily achieved when the weights are randomly generated under relatively weak assumptions.

Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection

no code implementations · 24 Oct 2019 · David Gamarnik, Eren C. Kızıldağ, Ilias Zadik

Using a novel combination of the PSLQ integer relation detection algorithm and the LLL lattice basis reduction algorithm, we propose a polynomial-time algorithm that provably recovers a $\beta^*\in\mathbb{R}^p$ enjoying the mixed-support assumption from its linear measurements $Y=X\beta^*\in\mathbb{R}^n$, for a large class of distributions for the random entries of $X$, even with a single measurement $(n=1)$.

Regression · Relation +1
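As a toy illustration of the $n=1$ regime described in this abstract, the following sketch uses `mpmath.pslq` (a PSLQ integer relation detector) to recover integer-valued coefficients from a single noiseless measurement. This is not the paper's algorithm — the paper handles a far more general mixed-support setting and combines PSLQ with LLL — and the particular $\beta^*$ and design row $X$ below are invented for illustration.

```python
# Hedged sketch: recover hidden integer coefficients beta* from one
# measurement Y = <X, beta*> via integer relation detection (PSLQ).
# Example data is illustrative only, not from the paper.
from mpmath import mp, sqrt, pslq

mp.dps = 50  # PSLQ needs high working precision to isolate the relation

beta_star = [3, -7, 2, 5]                     # hidden integer coefficients
X = [sqrt(2), sqrt(3), sqrt(5), sqrt(7)]      # one "generic" design row
Y = sum(b * x for b, x in zip(beta_star, X))  # the single measurement (n = 1)

# PSLQ seeks a small integer vector c with c . (X_1,...,X_p, Y) ~ 0.
# Because sqrt(2), sqrt(3), sqrt(5), sqrt(7) are Q-linearly independent,
# every integer relation is a multiple of (beta*, -1), so the smallest
# relation PSLQ returns is (beta*, -1) up to sign.
rel = pslq(X + [Y])
if rel[-1] == 1:            # normalize the sign so the last entry is -1
    rel = [-c for c in rel]
beta_hat = rel[:-1]
print(beta_hat)             # expect [3, -7, 2, 5]
```

The key design point mirrored here is that a single real-valued measurement, stored to sufficient precision, already pins down integer-like coefficients; the paper's contribution is making this rigorous and polynomial-time for random $X$ under the mixed-support assumption.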
