no code implementations • 4 Mar 2024 • Joost A. A. Opschoor, Christoph Schwab
We analyze deep neural network emulation rates of smooth functions with point singularities in bounded, polytopal domains $\mathrm{D} \subset \mathbb{R}^d$, $d=2, 3$.
12 Jan 2024 • Joost A. A. Opschoor, Christoph Schwab, Christos Xenophontos
We prove deep neural network (DNN for short) expressivity rate bounds for solution sets of a model class of singularly perturbed, elliptic two-point boundary value problems, in Sobolev norms, on the bounded interval $(-1, 1)$.
11 Oct 2023 • Joost A. A. Opschoor, Christoph Schwab
We show expression rates and stability in Sobolev norms of deep feedforward ReLU neural networks (NNs) in terms of the number of parameters defining the NN for continuous, piecewise polynomial functions, on arbitrary, finite partitions $\mathcal{T}$ of a bounded interval $(a, b)$.
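For the piecewise linear case, the expression result is exact: any continuous piecewise linear function on a partition of $(a, b)$ is representable by a one-hidden-layer ReLU network with one unit per interior breakpoint. The sketch below (an illustration of this classical fact, not code from the paper; the partition and nodal values are made up) builds such a network from the slope jumps and checks it against direct interpolation.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Partition nodes of (a, b) = (0, 1) and nodal values of a
# continuous piecewise linear function (hypothetical example data).
nodes = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
vals = np.array([0.0, 1.0, 0.5, 0.8, 0.0])

# Slope on each subinterval, then the jump in slope at each node:
# f(x) = vals[0] + slopes[0]*relu(x - nodes[0])
#               + sum_i (slopes[i] - slopes[i-1]) * relu(x - nodes[i])
slopes = np.diff(vals) / np.diff(nodes)
coeffs = np.concatenate([[slopes[0]], np.diff(slopes)])

def relu_nn(x):
    # One hidden layer: units relu(x - nodes[i]) for each breakpoint
    # except the right endpoint; linear output layer with weights `coeffs`.
    hidden = relu(x[:, None] - nodes[:-1][None, :])
    return vals[0] + hidden @ coeffs

xs = np.linspace(0.0, 1.0, 101)
exact = np.interp(xs, nodes, vals)
print(np.max(np.abs(relu_nn(xs) - exact)))  # zero up to rounding error
```

The network size scales with the number of partition cells, matching the parameter counts in which the expression rates above are stated; piecewise polynomials of higher degree require deeper ReLU networks that only approximate (rather than reproduce) the target.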
14 Jan 2022 • Marcello Longo, Joost A. A. Opschoor, Nico Disch, Christoph Schwab, Jakob Zech
Our construction and DNN architecture generalize previous results in that no geometric restrictions on the regular simplicial partitions $\mathcal{T}$ of $\Omega$ are required for DNN emulation.
23 Oct 2020 • Carlo Marcati, Joost A. A. Opschoor, Philipp C. Petersen, Christoph Schwab
We prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in $H^1(\Omega)$ for weighted analytic function classes in certain polytopal domains $\Omega$, in space dimension $d=2, 3$.