no code implementations • 5 Jun 2024 • Andrea Montanari, Kangjie Zhou
Given $d$-dimensional standard Gaussian vectors $\boldsymbol{x}_1,\dots, \boldsymbol{x}_n$, we consider the set of all empirical distributions of their $m$-dimensional projections, for $m$ a fixed constant.
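As a minimal illustration of the object being studied (not the paper's construction), one can sample $n$ standard Gaussian vectors in $\mathbb{R}^d$ and form the empirical distribution of their projections onto an $m$-dimensional frame; the frame `W` below is one hypothetical choice, and varying it traces out the set of attainable empirical distributions:

```python
import numpy as np

# Sketch with illustrative parameters (not taken from the paper).
rng = np.random.default_rng(0)
d, n, m = 100, 500, 2

X = rng.standard_normal((n, d))   # rows x_1, ..., x_n ~ N(0, I_d)

# An orthonormal m-frame W in R^d defines one projection; the paper
# considers the set of empirical distributions over all such frames.
W, _ = np.linalg.qr(rng.standard_normal((d, m)))

P = X @ W                         # n projected points in R^m
# The empirical distribution is the uniform measure on the rows of P;
# for a fixed frame its first two moments are close to those of N(0, I_m).
print(P.mean(axis=0))
print(np.cov(P.T))
```

For a single fixed frame the projected points look approximately standard Gaussian; the interesting question in the paper is how far from Gaussian the empirical distribution can be made by choosing the frame adaptively.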
no code implementations • 2 Jan 2024 • Yuchen Wu, Kangjie Zhou
We investigate the power iteration algorithm for the tensor PCA model introduced in Richard and Montanari (2014).
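A hedged sketch of tensor power iteration on the spiked tensor model of Richard and Montanari (2014): a rank-one signal $\beta\, \boldsymbol{v}^{\otimes 3}$ plus Gaussian noise, iterated via the map $\boldsymbol{u} \mapsto \boldsymbol{T}(\cdot, \boldsymbol{u}, \boldsymbol{u})$ and renormalization. The dimension, signal-to-noise ratio, and warm start below are illustrative choices, not the paper's setting:

```python
import numpy as np

rng = np.random.default_rng(1)
d, beta = 30, 10.0                     # illustrative dimension and SNR

v = rng.standard_normal(d)
v /= np.linalg.norm(v)                 # planted spike direction

# Spiked order-3 tensor: T = beta * v^{x3} + W / sqrt(d), W Gaussian noise.
W = rng.standard_normal((d, d, d))
T = beta * np.einsum('i,j,k->ijk', v, v, v) + W / np.sqrt(d)

# Warm start correlated with the spike (the behavior from a purely
# random start is exactly what the analysis in the paper addresses).
u = v + 0.3 * rng.standard_normal(d)
u /= np.linalg.norm(u)

for _ in range(50):                    # tensor power iteration
    u = np.einsum('ijk,j,k->i', T, u, u)   # contract T against u twice
    u /= np.linalg.norm(u)

print(abs(u @ v))                      # overlap with the planted spike
```

With a sufficiently large SNR and a correlated initialization, the iterates align with the spike; the paper's contribution concerns when this happens and how fast.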
no code implementations • 28 Feb 2023 • Raphaël Berthier, Andrea Montanari, Kangjie Zhou
In this paper, we study the gradient flow dynamics of a wide two-layer neural network in high dimension, when data are distributed according to a single-index model (i.e., the target function depends on a one-dimensional projection of the covariates).
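A minimal sketch of the setup, under assumed choices (a `tanh` link, a ReLU network with frozen second layer, and discretized gradient steps standing in for gradient flow); none of these specifics are claimed to match the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n, N = 20, 200, 100                 # illustrative sizes

# Single-index model: the target depends on x only through <w*, x>.
w_star = np.zeros(d); w_star[0] = 1.0  # hidden index direction
phi = np.tanh                          # hypothetical link function

X = rng.standard_normal((n, d))
y = phi(X @ w_star)

# Wide two-layer network f(x) = (1/N) * sum_j a_j * relu(<w_j, x>),
# training only the first layer, second-layer signs frozen.
Wmat = rng.standard_normal((N, d)) / np.sqrt(d)
a = rng.choice([-1.0, 1.0], size=N)

def relu(z):
    return np.maximum(z, 0.0)

def loss(Wmat):
    pred = relu(X @ Wmat.T) @ a / N
    return 0.5 * np.mean((pred - y) ** 2)

lr = 1.0
L0 = loss(Wmat)
for _ in range(300):                   # Euler discretization of gradient flow
    pre = X @ Wmat.T                   # (n, N) preactivations
    pred = relu(pre) @ a / N
    err = (pred - y) / n
    grad_W = ((pre > 0) * np.outer(err, a)).T @ X / N
    Wmat -= lr * grad_W
print(L0, loss(Wmat))                  # training loss decreases
```

The paper studies the continuous-time limit of this kind of dynamics in high dimension, where the first-layer weights must discover the one-dimensional index direction.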
no code implementations • 7 Nov 2022 • Yuchen Wu, Kangjie Zhou
In this paper, we analyze the dynamics of tensor power iteration from random initialization in the overcomplete regime.
no code implementations • 14 Jun 2022 • Andrea Montanari, Kangjie Zhou
Denoting by $\mathscr{F}_{m, \alpha}$ the set of probability distributions in $\mathbb{R}^m$ that arise as low-dimensional projections in this limit, we establish new inner and outer bounds on $\mathscr{F}_{m, \alpha}$.
no code implementations • 28 Oct 2021 • Andrea Montanari, Yiqiao Zhong, Kangjie Zhou
In the negative perceptron problem we are given $n$ data points $({\boldsymbol x}_i, y_i)$, where ${\boldsymbol x}_i$ is a $d$-dimensional vector and $y_i\in\{+1,-1\}$ is a binary label.
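As a hedged illustration of the central quantity, the (normalized) margin of a candidate separator $\boldsymbol{\theta}$ is $\min_i y_i \langle \boldsymbol{\theta}, \boldsymbol{x}_i\rangle / \|\boldsymbol{\theta}\|$; the "negative" in negative perceptron refers to the regime in which the best achievable margin is negative. The min-norm interpolator below is a hypothetical choice of separator, shown in an overparametrized setting ($d > n$) where the margin is still positive:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 50, 200                          # illustrative: overparametrized d > n

X = rng.standard_normal((n, d))         # rows x_i ~ N(0, I_d)
y = rng.choice([-1.0, 1.0], size=n)     # random binary labels

# Normalized margin of a candidate separator theta.
def margin(theta):
    return np.min(y * (X @ theta)) / np.linalg.norm(theta)

# Min-norm interpolator of the labels (hypothetical choice): X theta = y,
# so every y_i <theta, x_i> equals 1 and the margin is 1 / ||theta||.
theta = X.T @ np.linalg.solve(X @ X.T, y)
print(margin(theta))                    # positive here since d > n
```

When $n/d$ grows past a critical proportion, no $\boldsymbol{\theta}$ achieves a positive margin on random labels, and the question becomes how negative the optimal margin is; that proportional regime is the subject of the paper.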