no code implementations • 1 Apr 2024 • Huan Zhang, Yifan Chen, Eric Vanden-Eijnden, Benjamin Peherstorfer
Sequential-in-time methods solve a sequence of training problems to fit nonlinear parametrizations such as neural networks to approximate solution trajectories of partial differential equations over time.
no code implementations • 22 Feb 2024 • Jules Berman, Benjamin Peherstorfer
This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions.
1 code implementation • 11 Oct 2023 • Paul Schwerdtner, Philipp Schulze, Jules Berman, Benjamin Peherstorfer
This work focuses on the conservation of quantities such as Hamiltonians, mass, and momentum when solution fields of partial differential equations are approximated with nonlinear parametrizations such as deep networks.
2 code implementations • NeurIPS 2023 • Jules Berman, Benjamin Peherstorfer
Training neural networks sequentially in time to approximate solution fields of time-dependent partial differential equations can be beneficial for preserving causality and other physics properties; however, the sequential-in-time training is numerically challenging because training errors quickly accumulate and amplify over time.
no code implementations • 23 Jul 2023 • Aimee Maurais, Terrence Alsup, Benjamin Peherstorfer, Youssef Marzouk
We introduce a multifidelity estimator of covariance matrices formulated as the solution to a regression problem on the manifold of symmetric positive definite matrices.
no code implementations • 27 Jun 2023 • Yuxiao Wen, Eric Vanden-Eijnden, Benjamin Peherstorfer
Training nonlinear parametrizations such as deep neural networks to numerically approximate solutions of partial differential equations is often based on minimizing a loss that includes the residual, which is analytically available in limited settings only.
no code implementations • 19 Feb 2023 • Pawan Goyal, Benjamin Peherstorfer, Peter Benner
While extracting information from data with machine learning plays an increasingly important role, physical laws and other first principles continue to provide critical insights about systems and processes of interest in science and engineering.
1 code implementation • 31 Jan 2023 • Aimee Maurais, Terrence Alsup, Benjamin Peherstorfer, Youssef Marzouk
We introduce a multi-fidelity estimator of covariance matrices that employs the log-Euclidean geometry of the symmetric positive-definite manifold.
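As a rough illustration of the idea, a two-fidelity control variate applied to matrix logarithms keeps the combined estimate on the symmetric positive-definite manifold. The sketch below is illustrative only (function names, the paired-sample setup, and the fixed coefficient `alpha` are assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.linalg import expm, logm


def sample_cov(X):
    # Unbiased sample covariance of the rows of X.
    return np.cov(X, rowvar=False)


def mf_covariance(X_hi, X_lo_paired, X_lo_extra, alpha=1.0):
    """Two-fidelity covariance estimate in log-Euclidean geometry.

    Combines few high-fidelity samples with many cheap low-fidelity
    samples via a control variate on the matrix logarithms, then maps
    back with the matrix exponential so the result stays symmetric
    positive definite.
    """
    L_hi = np.real(logm(sample_cov(X_hi)))
    L_lo_small = np.real(logm(sample_cov(X_lo_paired)))
    L_lo_big = np.real(logm(sample_cov(np.vstack([X_lo_paired, X_lo_extra]))))
    # Control-variate correction in the (linear) log-Euclidean space.
    L = L_hi + alpha * (L_lo_big - L_lo_small)
    S = expm(L)
    return 0.5 * (S + S.T)  # symmetrize away round-off
```

Because the correction is applied in the flat log-Euclidean space, any value of `alpha` yields a symmetric positive-definite estimate, which a naive entrywise control variate would not guarantee.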
no code implementations • 6 Dec 2022 • Terrence Alsup, Tucker Hartland, Benjamin Peherstorfer, Noemi Petra
Multilevel Stein variational gradient descent is a method for particle-based variational inference that leverages hierarchies of surrogate target distributions with varying costs and fidelity to computationally speed up inference.
no code implementations • 2 Dec 2022 • Wayne Isaac Tan Uy, Dirk Hartmann, Benjamin Peherstorfer
Data-driven modeling has become a key building block in computational science and engineering.
1 code implementation • 20 Nov 2022 • Ionut-Gabriel Farcas, Benjamin Peherstorfer, Tobias Neckel, Frank Jenko, Hans-Joachim Bungartz
When training low-fidelity models, the proposed approach takes into account the context in which the learned models will be used, namely variance reduction in Monte Carlo estimation. This allows it to find optimal trade-offs between training and sampling that minimize upper bounds of the estimators' mean-squared errors for given computational budgets.
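The variance-reduction mechanism referred to here is the standard multifidelity Monte Carlo control variate; a minimal two-model sketch (the function name and interface are illustrative) looks like:

```python
import numpy as np


def mfmc_estimate(f_hi, f_lo, alpha):
    """Two-model multifidelity Monte Carlo estimator of E[f_hi].

    f_hi:  high-fidelity outputs at n inputs.
    f_lo:  low-fidelity outputs at m >= n inputs, where the first n
           inputs coincide with those used for f_hi.
    alpha: control-variate coefficient; the variance-optimal choice is
           rho * sigma_hi / sigma_lo.
    """
    n = len(f_hi)
    # Plain Monte Carlo mean plus a correction that exploits the
    # correlation between the two models on the shared inputs.
    return f_hi.mean() + alpha * (f_lo.mean() - f_lo[:n].mean())
```

The better the low-fidelity model correlates with the high-fidelity one, the more variance the correction term removes at a fixed budget, which is exactly the quantity the context-aware training above targets.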
no code implementations • 22 Jul 2022 • Steffen W. R. Werner, Benjamin Peherstorfer
This work introduces a data-driven control approach for stabilizing high-dimensional dynamical systems from scarce data.
1 code implementation • 2 Mar 2022 • Joan Bruna, Benjamin Peherstorfer, Eric Vanden-Eijnden
Neural Galerkin schemes build on the Dirac-Frenkel variational principle to train networks by minimizing the residual sequentially over time, which enables adaptively collecting new training data in a self-informed manner that is guided by the dynamics described by the partial differential equations.
no code implementations • 28 Feb 2022 • Steffen W. R. Werner, Benjamin Peherstorfer
Learning controllers from data for stabilizing dynamical systems typically follows a two-step process of first identifying a model and then constructing a controller based on the identified model.
1 code implementation • 9 Aug 2021 • Karl Otness, Arvi Gjoka, Joan Bruna, Daniele Panozzo, Benjamin Peherstorfer, Teseo Schneider, Denis Zorin
Simulating physical systems is a core component of scientific computing, encompassing a wide range of physical domains and applications.
no code implementations • 20 Jul 2021 • Wayne Isaac Tan Uy, Yuepeng Wang, Yuxiao Wen, Benjamin Peherstorfer
Furthermore, the connection between operator inference and projection-based model reduction enables bounding the mean-squared errors of predictions made with the learned models with respect to traditional reduced models.
no code implementations • 6 Jul 2021 • Nihar Sawant, Boris Kramer, Benjamin Peherstorfer
Operator inference learns low-dimensional dynamical-system models with polynomial nonlinear terms from trajectories of high-dimensional physical systems (non-intrusive model reduction).
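In its simplest form, operator inference fits the operators of a model with polynomial (here quadratic) terms to snapshot data with ordinary least squares. The sketch below is a minimal full-state version (interface and variable names are illustrative; in practice the states are first projected onto a low-dimensional basis such as POD):

```python
import numpy as np


def operator_inference(X, Xdot):
    """Learn a quadratic model  x_dot ~ A x + H kron(x, x)  from data.

    X, Xdot: arrays of shape (n_samples, d) with state snapshots and
    their time derivatives (e.g. from finite differences).
    """
    n = len(X)
    # Data matrix with linear and quadratic regressors per sample.
    quad = np.einsum('ni,nj->nij', X, X).reshape(n, -1)
    D = np.hstack([X, quad])
    # Least-squares fit of all operators at once (non-intrusive:
    # only trajectory data is used, no access to the full model).
    O, *_ = np.linalg.lstsq(D, Xdot, rcond=None)
    d = X.shape[1]
    A = O[:d].T
    H = O[d:].T
    return A, H
```

Because kron(x, x) contains each cross term twice, H is not unique; the minimum-norm least-squares solution still reproduces the dynamics exactly on noiseless data.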
no code implementations • 5 Apr 2021 • Terrence Alsup, Luca Venturi, Benjamin Peherstorfer
The proposed multilevel Stein variational gradient descent performs most iterations on lower, cheaper levels, with the aim of requiring only a few iterations on the higher, more expensive levels, compared with the traditional single-level variant that uses the highest-level distribution only.
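On each level, the workhorse is the standard Stein variational gradient descent update. A single-level sketch with an RBF kernel follows; the bandwidth, step size, and function names are illustrative constants, not the paper's multilevel schedule:

```python
import numpy as np


def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One Stein variational gradient descent update with an RBF kernel.

    X:         (n, d) particle positions.
    grad_logp: function returning grad log of the target density at
               each particle, shape (n, d).
    """
    n = len(X)
    diff = X[:, None, :] - X[None, :, :]       # pairwise x_i - x_j
    sq = (diff**2).sum(-1)
    K = np.exp(-sq / (2 * h**2))               # RBF kernel matrix
    grad = grad_logp(X)
    # Kernel-weighted gradients (drift toward the target) plus the
    # kernel-gradient term (repulsion that keeps particles spread out).
    phi = (K @ grad + (K[:, :, None] * diff).sum(1) / h**2) / n
    return X + eps * phi
```

The multilevel idea is then to run many such updates against cheap surrogate targets and only a few against the expensive highest-fidelity target.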
1 code implementation • 1 Mar 2021 • Wayne Isaac Tan Uy, Benjamin Peherstorfer
The core contributions of this work are a data sampling scheme to sample partially observed states from high-dimensional dynamical systems and a formulation of a regression problem to fit the non-Markovian reduced terms to the sampled states.
no code implementations • 22 Oct 2020 • Terrence Alsup, Benjamin Peherstorfer
Thus, there is a trade-off between investing computational resources to improve the accuracy of surrogate models and simply making more frequent recourse to expensive high-fidelity models. Traditional modeling methods ignore this trade-off: they construct surrogate models that are meant to replace high-fidelity models rather than to be used together with them.
no code implementations • 28 Jul 2020 • Donsub Rim, Luca Venturi, Joan Bruna, Benjamin Peherstorfer
Classical reduced models are low-rank approximations using a fixed basis designed to achieve dimensionality reduction of large-scale systems.
1 code implementation • 12 May 2020 • Wayne Isaac Tan Uy, Benjamin Peherstorfer
This work derives a residual-based a posteriori error estimator for reduced models learned with non-intrusive model reduction from data of high-dimensional systems governed by linear parabolic partial differential equations with control inputs.
1 code implementation • 22 Feb 2020 • Peter Benner, Pawan Goyal, Boris Kramer, Benjamin Peherstorfer, Karen Willcox
The proposed method learns operators for the linear and polynomially nonlinear dynamics via a least-squares problem, where the given non-polynomial terms are incorporated in the right-hand side.
4 code implementations • 17 Dec 2019 • Elizabeth Qian, Boris Kramer, Benjamin Peherstorfer, Karen Willcox
The lifting map is applied to data obtained by evaluating a model for the original nonlinear system.
no code implementations • 30 Sep 2019 • Zlatko Drmač, Benjamin Peherstorfer
Loewner rational interpolation provides a versatile tool to learn low-dimensional dynamical-system models from frequency-response measurements.
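The core objects of the Loewner framework are the Loewner and shifted-Loewner matrices assembled directly from transfer-function samples; a minimal SISO sketch (the function name is illustrative) is:

```python
import numpy as np


def loewner_matrices(mu, lam, H_mu, H_lam):
    """Assemble the Loewner matrix L and shifted-Loewner matrix Ls from
    transfer-function samples H(mu_i), H(lam_j) at disjoint left points
    mu and right points lam (SISO case).

    L[i, j]  = (H(mu_i) - H(lam_j)) / (mu_i - lam_j)
    Ls[i, j] = (mu_i H(mu_i) - lam_j H(lam_j)) / (mu_i - lam_j)
    """
    d = mu[:, None] - lam[None, :]
    L = (H_mu[:, None] - H_lam[None, :]) / d
    Ls = (mu[:, None] * H_mu[:, None] - lam[None, :] * H_lam[None, :]) / d
    return L, Ls
```

The rank of L reveals the order of a minimal realization of the underlying system, and the pencil (Ls, L) carries its poles, which is what makes the framework a learning tool rather than just an interpolation device.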
2 code implementations • 29 Aug 2019 • Benjamin Peherstorfer
Thus, the learned models are guaranteed to inherit the well-studied properties of reduced models from traditional model reduction.
1 code implementation • 30 Aug 2018 • Benjamin Peherstorfer, Zlatko Drmač, Serkan Gugercin
Numerical experiments with synthetic and diffusion-reaction problems demonstrate the stability of oversampled empirical interpolation in the presence of noise.