NeurIPS 2021 • Prateek Jain, Suhas S Kowshik, Dheeraj Nagaraj, Praneeth Netrapalli
In this work, we improve existing results for learning nonlinear systems in a number of ways: a) we provide the first offline algorithm that can learn nonlinear dynamical systems without the mixing assumption, b) we significantly improve upon the sample complexity of existing results for mixing systems, c) in the much harder one-pass, streaming setting we study an SGD with Reverse Experience Replay ($\mathsf{SGD-RER}$) method and demonstrate that, for mixing systems, it achieves the same sample complexity as our offline algorithm, d) we justify the expansivity assumption by showing that for the popular ReLU link function -- a non-expansive but easy-to-learn link function with i.i.d.
NeurIPS 2021 • Prateek Jain, Suhas S Kowshik, Dheeraj Nagaraj, Praneeth Netrapalli
Thus, we provide the first -- to the best of our knowledge -- optimal SGD-style algorithm for the classical problem of linear system identification with a first-order oracle.
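As a rough illustration of the reverse-experience-replay idea described above, the following sketch applies it to the simplest case: single-pass linear system identification $x_{t+1} = A^* x_t + \eta_t$. All dimensions, step sizes, and buffer sizes here are illustrative choices, not the papers' tuned parameters, and the update shown is plain least-squares SGD rather than the authors' exact analyzed procedure.

```python
import numpy as np

# Toy sketch of SGD with Reverse Experience Replay (SGD-RER) for
# linear system identification x_{t+1} = A* x_t + noise.
# Hyperparameters below are illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)
d, T, B, lr = 4, 20000, 50, 0.005

# Ground-truth stable system matrix A* (spectral norm 0.5).
A_star = 0.5 * np.linalg.qr(rng.standard_normal((d, d)))[0]

# Generate one trajectory: this is the single pass over the data stream.
X = np.zeros((T + 1, d))
for t in range(T):
    X[t + 1] = A_star @ X[t] + rng.standard_normal(d)

# SGD-RER: split the stream into small buffers and replay each buffer
# in reverse time order; replaying backwards decorrelates the gradient
# noise that temporal dependence would otherwise introduce.
A_hat = np.zeros((d, d))
for start in range(0, T, B):
    end = min(start + B, T)
    for t in range(end - 1, start - 1, -1):  # reverse order within buffer
        residual = A_hat @ X[t] - X[t + 1]
        A_hat -= lr * np.outer(residual, X[t])

print(np.linalg.norm(A_hat - A_star))  # estimation error in Frobenius norm
```

The only change relative to ordinary streaming SGD is the inner loop's reversed iteration order; with forward-order updates on a single dependent trajectory, consecutive gradients are correlated and bias the estimate.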