# Singular Vectors of Sums of Rectangular Random Matrices and Optimal Estimators of High-Rank Signals: The Extensive Spike Model

Across many disciplines from neuroscience and genomics to machine learning, atmospheric science and finance, the problems of denoising large data matrices to recover signals obscured by noise, and of estimating the structure of these signals, are of fundamental importance. A key to solving these problems lies in understanding how the singular value structure of a signal is deformed by noise. This question has been thoroughly studied in the well-known spiked matrix model, in which data matrices originate from low-rank signals perturbed by additive noise, in an asymptotic limit where the size of these matrices tends to infinity but the signal rank remains finite. We first show, strikingly, that the singular value structure of large finite matrices (of size $\sim 1000$) with even moderate-rank signals, as low as $10$, is not accurately predicted by the finite-rank theory, thereby limiting the application of this theory to real data. To address these deficiencies, we analytically compute how the singular values and vectors of an arbitrary high-rank signal matrix are deformed by additive noise. We next study an asymptotic limit corresponding to an *extensive* spike model, in which the rank of the hidden signal is proportional to the size of the data matrix, while both tend to infinity. We map out the phase diagram of the singular value structure of the extensive spike model as a joint function of signal strength and rank. We further exploit these analytics to derive optimal rotationally invariant denoisers to recover hidden *high*-rank signals from data, as well as optimal invariant estimators of the signal covariance structure. Overall, our results provide fundamental theory governing how high-dimensional signals are deformed by additive noise, together with practical formulas for optimal denoising and covariance estimation.
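As a minimal illustrative sketch (not taken from the paper), the finite-rank spiked matrix model discussed above can be simulated directly: a rank-one signal of strength $\theta$ plus i.i.d. Gaussian noise scaled by $1/\sqrt{N}$. For a square matrix with $\theta > 1$, the classical finite-rank (BBP-type) prediction places the top singular value of the data at $(1+\theta^2)/\theta$, above the noise bulk edge at $2$. The choices $N = 1000$ and $\theta = 3$ here are arbitrary for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, theta = 1000, 3.0  # matrix size and signal strength (illustrative values)

# Rank-one signal theta * u v^T with random unit singular vectors
u = rng.standard_normal(N)
u /= np.linalg.norm(u)
v = rng.standard_normal(N)
v /= np.linalg.norm(v)

# Data = signal + additive Gaussian noise with entries of variance 1/N
X = theta * np.outer(u, v) + rng.standard_normal((N, N)) / np.sqrt(N)

# Empirical top singular value vs. the finite-rank (BBP-type) prediction
s_top = np.linalg.svd(X, compute_uv=False)[0]
prediction = (1 + theta**2) / theta  # valid for square matrices when theta > 1
print(f"empirical: {s_top:.3f}, predicted: {prediction:.3f}")
```

At this size and rank one the agreement is close; the paper's point is that for moderate ranks (e.g. $10$) relative to $N \sim 1000$, predictions of this finite-rank form become inaccurate, motivating the extensive spike analysis.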
