Search Results for author: Nick Dexter

Found 7 papers, 2 papers with code

Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks

no code implementations • 4 Apr 2024 • Ben Adcock, Simone Brugiapaglia, Nick Dexter, Sebastian Moraga

For the latter, there is currently a significant gap between the approximation theory of DNNs and the practical performance of deep learning.

Uncertainty Quantification

A unified framework for learning with nonlinear model classes from arbitrary linear samples

no code implementations • 25 Nov 2023 • Ben Adcock, Juan M. Cardenas, Nick Dexter

In summary, our work not only introduces a unified way to study learning unknown objects from general types of data, but also establishes a series of general theoretical guarantees which consolidate and improve various known results.

Active Learning • Generalization Bounds

CS4ML: A general framework for active learning with arbitrary data based on Christoffel functions

no code implementations • NeurIPS 2023 • Ben Adcock, Juan M. Cardenas, Nick Dexter

Our framework extends the standard setup by allowing for general types of data, rather than merely pointwise samples of the target function.

Active Learning

CAS4DL: Christoffel Adaptive Sampling for function approximation via Deep Learning

1 code implementation • 25 Aug 2022 • Ben Adcock, Juan M. Cardenas, Nick Dexter

In this work, we propose an adaptive sampling strategy, CAS4DL (Christoffel Adaptive Sampling for Deep Learning), to increase the sample efficiency of DL for multivariate function approximation.
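To illustrate the kind of Christoffel-weighted sampling the snippet above refers to, here is a minimal sketch using a fixed orthonormal Legendre basis on [-1, 1]. This is a generic illustration of the sampling principle, not the paper's algorithm: CAS4DL adapts the basis during DL training, whereas the fixed basis, the function names, and the rejection-sampling scheme below are all assumptions made for this example.

```python
import numpy as np
from numpy.polynomial import legendre

# Sketch (assumed setup, not CAS4DL itself): for an orthonormal basis
# {phi_0, ..., phi_{s-1}}, the reciprocal Christoffel function
#   K(x) = sum_j phi_j(x)^2
# defines a near-optimal sampling density mu(x) proportional to K(x)/s
# for least-squares approximation in the span of the basis.

def christoffel(x, s):
    """K(x) = sum of squares of the first s orthonormal Legendre polynomials."""
    K = np.zeros_like(x, dtype=float)
    for j in range(s):
        c = np.zeros(j + 1)
        c[j] = 1.0
        # P_j has L^2 norm sqrt(2/(2j+1)) on [-1,1]; rescale to unit norm.
        phi = legendre.legval(x, c) * np.sqrt((2 * j + 1) / 2)
        K += phi ** 2
    return K

def sample_christoffel(s, n, rng):
    """Rejection-sample n points on [-1,1] with density proportional to K(x)."""
    # |P_j(x)| <= 1 on [-1,1], so K attains its maximum at x = +/-1.
    M = christoffel(np.array([1.0]), s)[0]
    out = []
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0, size=n)
        u = rng.uniform(0.0, M, size=n)
        out.extend(x[u < christoffel(x, s)].tolist())
    return np.array(out[:n])
```

Since each orthonormal basis function integrates to 1 in squared norm, K integrates to s over [-1, 1]; the resulting density concentrates samples near the endpoints, which is exactly where uniform sampling under-resolves polynomial least-squares problems.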

Uncertainty Quantification

On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples

no code implementations • 25 Mar 2022 • Ben Adcock, Simone Brugiapaglia, Nick Dexter, Sebastian Moraga

On the one hand, there is a well-developed theory of best $s$-term polynomial approximation, which asserts exponential or algebraic rates of convergence for holomorphic functions.
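The exponential rates mentioned in the snippet above can be demonstrated in a small one-dimensional experiment. The sketch below is an illustration under assumed choices (the test function, the Chebyshev basis, the degree-60 interpolant, and the helper name `best_s_term_error` are all ours, not the paper's): for an analytic function, keeping only the s largest-magnitude polynomial coefficients yields an error that decays rapidly in s.

```python
import numpy as np
from numpy.polynomial import chebyshev

# Sketch: best s-term approximation in a Chebyshev basis. For functions
# holomorphic in a neighborhood of [-1,1], the Chebyshev coefficients
# decay geometrically, so the best s-term error decays exponentially in s.

def best_s_term_error(f, s, degree=60, n_grid=1000):
    """Sup-norm error of the best s-term Chebyshev approximation of f."""
    # Coefficients via interpolation at Chebyshev points of the first kind.
    coeffs = chebyshev.chebinterpolate(f, degree)
    # Keep the s largest-magnitude coefficients; zero out the rest.
    keep = np.argsort(np.abs(coeffs))[-s:]
    c_s = np.zeros_like(coeffs)
    c_s[keep] = coeffs[keep]
    x = np.linspace(-1.0, 1.0, n_grid)
    return np.max(np.abs(f(x) - chebyshev.chebval(x, c_s)))

f = lambda x: 1.0 / (1.0 + 4.0 * x ** 2)  # analytic on [-1, 1]
errs = [best_s_term_error(f, s) for s in (2, 5, 10, 20)]
```

The errors shrink by orders of magnitude as s grows, which is the one-dimensional analogue of the algebraic and exponential rates the paper establishes for high-dimensional, Hilbert-valued functions.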

Uncertainty Quantification

Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data

no code implementations • 11 Dec 2020 • Ben Adcock, Simone Brugiapaglia, Nick Dexter, Sebastian Moraga

Such problems are challenging: 1) pointwise samples are expensive to acquire, 2) the function domain is high-dimensional, and 3) the range lies in a Hilbert space.

The gap between theory and practice in function approximation with deep neural networks

1 code implementation • 16 Jan 2020 • Ben Adcock, Nick Dexter

Our main conclusion from these experiments is that there is a crucial gap between the approximation theory of DNNs and their practical performance, with trained DNNs performing relatively poorly on functions for which there are strong approximation results (e.g., smooth functions), yet performing well in comparison to best-in-class methods for other functions.

Computational Efficiency • Decision Making
