no code implementations • 23 May 2023 • Manuel Kunz, Stefan Birr, Mones Raslan, Lei Ma, Zhen Li, Adele Gouttes, Mateusz Koren, Tofigh Naghibi, Johannes Stephan, Mariia Bulycheva, Matthias Grzeschik, Armin Kekić, Michael Narodovitch, Kashif Rasul, Julian Sieber, Tim Januschowski
These include the volume of data, its irregularity, the high turnover in the catalog, and the fixed inventory assumption.
no code implementations • 9 Jul 2020 • Ingo Gühring, Mones Raslan, Gitta Kutyniok
In this review paper, we give a comprehensive overview of the large variety of approximation results for neural networks.
1 code implementation • 25 Apr 2020 • Moritz Geist, Philipp Petersen, Mones Raslan, Reinhold Schneider, Gitta Kutyniok
Here, approximation theory predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and should instead be determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation.
no code implementations • 31 Mar 2019 • Gitta Kutyniok, Philipp Petersen, Mones Raslan, Reinhold Schneider
We derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric partial differential equations.
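Complexity bounds of this kind are typically stated in terms of the number of nonzero weights and biases of a ReLU network. A minimal sketch (illustrative only, not the paper's construction): a one-hidden-layer ReLU network of width 2 exactly represents the classic "hat" function, a standard building block in ReLU approximation arguments, with a complexity of five nonzero parameters.

```python
def relu(x: float) -> float:
    """Rectified linear unit."""
    return max(x, 0.0)

# One-hidden-layer ReLU network, width 2:
#   hat(x) = 1*relu(1*x + 0) - 2*relu(1*x - 0.5)
# On [0, 1] this is the hat function peaking at (0.5, 0.5).
W1 = [1.0, 1.0]        # hidden-layer weights
b1 = [0.0, -0.5]       # hidden-layer biases
W2 = [1.0, -2.0]       # output-layer weights

def hat(x: float) -> float:
    hidden = [relu(w * x + b) for w, b in zip(W1, b1)]
    return sum(v * h for v, h in zip(W2, hidden))

# Complexity = number of nonzero weights and biases.
n_params = sum(1 for p in W1 + b1 + W2 if p != 0.0)
```

Counting nonzero parameters (here, `n_params == 5`) is the complexity measure against which such upper bounds are usually expressed.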
no code implementations • 22 Jun 2018 • Philipp Petersen, Mones Raslan, Felix Voigtlaender
We analyze the topological properties of the set of functions that can be implemented by neural networks of a fixed size.
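One such topological property is that this set is, in general, not closed: a sequence of fixed-size ReLU networks can converge to a discontinuous function, which no ReLU network (all of which are continuous) can realize. A hedged numerical sketch of the standard textbook example, using a fixed architecture of two hidden ReLU neurons whose parameters depend on `n` (illustrative, not the paper's argument):

```python
def relu(x: float) -> float:
    return max(x, 0.0)

def step_approx(x: float, n: int) -> float:
    """Fixed architecture (2 hidden ReLU neurons), parameters scaled by n.

    n * relu(x + 1/n) - n * relu(x) equals 0 for x <= -1/n and 1 for
    x >= 0, so as n grows it converges pointwise (and in L^p on a
    bounded domain) to the discontinuous Heaviside step function.
    """
    return n * relu(x + 1.0 / n) - n * relu(x)
```

Since the limit is discontinuous while every realization of the fixed architecture is continuous, the limit lies outside the set, illustrating its non-closedness.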
General Topology; Functional Analysis (MSC: 54H99, 68T05, 52A30)