1 code implementation • 15 Mar 2024 • Abigail Julian, Lars Ruthotto
Over the past decade, Reversed Gradient Polarity (RGP) methods have become a popular approach for correcting susceptibility artifacts in Echo-Planar Imaging (EPI).
no code implementations • 8 Jan 2024 • Lars Ruthotto
This short, self-contained article seeks to introduce and survey continuous-time deep learning approaches that are based on neural ordinary differential equations (neural ODEs).
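To make the core idea concrete, here is a minimal sketch (not from the article) of a neural ODE forward pass: the hidden state evolves according to dz/dt = f(z; θ), integrated here with forward Euler. The two-layer tanh network f, its weight scales, and the step count are illustrative assumptions.

```python
import numpy as np

def f(z, W1, W2):
    # Neural-network right-hand side: a small two-layer MLP with tanh.
    return W2 @ np.tanh(W1 @ z)

def neural_ode_forward(z0, W1, W2, T=1.0, steps=100):
    # Integrate dz/dt = f(z) from t=0 to t=T with forward Euler;
    # the "depth" of the network is the number of integration steps.
    h = T / steps
    z = z0.copy()
    for _ in range(steps):
        z = z + h * f(z, W1, W2)
    return z

rng = np.random.default_rng(0)
W1 = 0.1 * rng.standard_normal((8, 2))
W2 = 0.1 * rng.standard_normal((2, 8))
z0 = np.array([1.0, -1.0])
zT = neural_ode_forward(z0, W1, W2)
```

In practice, adaptive or higher-order integrators replace the fixed-step Euler scheme used here for simplicity.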
1 code implementation • 13 Nov 2023 • Malvern Madondo, Deepanshu Verma, Lars Ruthotto, Nicholas Au Yong
In this setting, control policies aim to optimize therapeutic outcomes by tailoring the parameters of a DBS system, typically via electrical stimulation, in real time based on the patient's ongoing neuronal activity.
2 code implementations • 25 Oct 2023 • Zheyu Oliver Wang, Ricardo Baptista, Youssef Marzouk, Lars Ruthotto, Deepanshu Verma
PCP-Map models conditional transport maps as the gradient of a partially input convex neural network (PICNN) and uses a novel numerical implementation to increase computational efficiency compared to state-of-the-art alternatives.
1 code implementation • 31 May 2023 • Alex Dunbar, Lars Ruthotto
We propose an alternating minimization heuristic for regression over the space of tropical rational functions with fixed exponents.
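As a hedged illustration of the objects being fit (not the paper's algorithm), a tropical rational function is the difference of two max-plus polynomials with fixed exponents; only the coefficients would be optimized in the alternating scheme. The exponents and coefficients below are made up.

```python
import numpy as np

def trop_poly(x, coeffs, exponents):
    # Max-plus polynomial: max_i (coeffs[i] + exponents[i] * x).
    return np.max(coeffs[:, None] + exponents[:, None] * x[None, :], axis=0)

def trop_rational(x, p_c, p_e, q_c, q_e):
    # Tropical rational function: difference of two max-plus polynomials,
    # a piecewise-linear function of x.
    return trop_poly(x, p_c, p_e) - trop_poly(x, q_c, q_e)

x = np.linspace(-2, 2, 5)
# Fixed exponents; regression would fit the coefficients p_c, q_c to data.
p_c, p_e = np.array([0.0, 1.0]), np.array([0.0, 1.0])
q_c, q_e = np.array([0.0, 0.5]), np.array([0.0, 2.0])
y = trop_rational(x, p_c, p_e, q_c, q_e)
```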
1 code implementation • 8 Mar 2023 • Paul Hagemann, Sophie Mildenberger, Lars Ruthotto, Gabriele Steidl, Nicole Tianjiao Yang
We thereby intend to obtain diffusion models that generalize across different resolution levels and improve the efficiency of the training process.
no code implementations • 31 Oct 2022 • Moshe Eliasof, Lars Ruthotto, Eran Treister
Graph Neural Networks (GNNs) are limited by their propagation operators.
no code implementations • 23 Feb 2022 • Kelvin Kan, François-Xavier Aubet, Tim Januschowski, Youngsuk Park, Konstantinos Benidis, Lars Ruthotto, Jan Gasthaus
We propose Multivariate Quantile Function Forecaster (MQF$^2$), a global probabilistic forecasting method constructed using a multivariate quantile function and investigate its application to multi-horizon forecasting.
1 code implementation • 28 Sep 2021 • Elizabeth Newman, Julianne Chung, Matthias Chung, Lars Ruthotto
In the absence of theoretical guidelines or prior experience on similar tasks, this requires solving many training problems, which can be time-consuming and demanding on computational resources.
1 code implementation • 9 Mar 2021 • Lars Ruthotto, Eldad Haber
Developing DGMs has become one of the most hotly researched fields in artificial intelligence in recent years.
1 code implementation • 11 Dec 2020 • Kelvin Kan, James G Nagy, Lars Ruthotto
To close this gap, the hybrid method considered in our paper combines the respective strengths of the two most common forms of regularization: early stopping and weight decay.
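The two regularizers being combined can be sketched on a toy least-squares problem (an illustrative setup, not the paper's method): weight decay adds a penalty term to the gradient, while early stopping monitors a validation set and halts when it deteriorates. All sizes, rates, and the stopping rule below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.1 * rng.standard_normal(40)
A_tr, b_tr, A_va, b_va = A[:30], b[:30], A[30:], b[30:]

lam, lr = 1e-2, 1e-2          # weight-decay strength and step size (illustrative)
x = np.zeros(5)
best_va, best_x = np.inf, x.copy()
for it in range(500):
    # Gradient of the training loss plus the weight-decay term lam * x.
    grad = A_tr.T @ (A_tr @ x - b_tr) / len(b_tr) + lam * x
    x -= lr * grad
    va = np.mean((A_va @ x - b_va) ** 2)
    if va < best_va:
        best_va, best_x = va, x.copy()
    elif it > 50 and va > 1.1 * best_va:   # crude early-stopping rule
        break
```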
1 code implementation • NeurIPS Workshop DLDE 2021 • Moshe Eliasof, Jonathan Ephrath, Lars Ruthotto, Eran Treister
We present a multigrid-in-channels (MGIC) approach that tackles the quadratic growth of the number of parameters with respect to the number of channels in standard convolutional neural networks (CNNs).
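The quadratic growth in question is easy to quantify. The sketch below only counts weights; it contrasts fully coupled channels with a simple grouped coupling to show the scaling issue MGIC targets, and is not the MGIC construction itself.

```python
def conv_params(c_in, c_out, k=3):
    # A standard convolution fully couples channels: k*k*c_in*c_out weights,
    # i.e. quadratic growth when c_in = c_out = c.
    return k * k * c_in * c_out

def grouped_conv_params(c, k=3, groups=8):
    # Grouped convolution couples channels only within each group,
    # cutting the weight count by a factor of `groups`.
    return k * k * (c // groups) * (c // groups) * groups

full = conv_params(256, 256)
grouped = grouped_conv_params(256)
ratio = full // grouped
```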
1 code implementation • 9 Nov 2020 • Derek Onken, Levon Nurbekyan, Xingjian Li, Samy Wu Fung, Stanley Osher, Lars Ruthotto
Our approach is grid-free and scales efficiently to dimensions where grids become impractical or infeasible.
Optimization and Control
1 code implementation • 26 Jul 2020 • Elizabeth Newman, Lars Ruthotto, Joseph Hart, Bart van Bloemen Waanders
To solve the optimization problem more efficiently, we propose the use of variable projection (VarPro), a method originally designed for separable nonlinear least-squares problems.
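VarPro's key move can be sketched on a toy separable model (an illustrative example, not the paper's training problem): for each value of the nonlinear variable, the linear coefficients are eliminated by solving a least-squares problem, leaving an optimization in the nonlinear variable alone. The exponential model and crude grid search below are assumptions.

```python
import numpy as np

def varpro_residual(theta, t, y):
    # Separable model y ≈ c1*exp(-theta*t) + c2: eliminate the linear
    # coefficients c by least squares, return the residual in theta only.
    Phi = np.column_stack([np.exp(-theta * t), np.ones_like(t)])
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return y - Phi @ c

t = np.linspace(0, 1, 50)
y = 2.0 * np.exp(-3.0 * t) + 0.5   # noiseless data with theta = 3

# Crude 1-D search over theta; VarPro in practice uses Gauss-Newton.
thetas = np.linspace(0.1, 10, 200)
errs = [np.linalg.norm(varpro_residual(th, t, y)) for th in thetas]
theta_best = thetas[int(np.argmin(errs))]
```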
no code implementations • 11 Jun 2020 • Jonathan Ephrath, Lars Ruthotto, Eran Treister
We present a multigrid approach that combats the quadratic growth of the number of parameters with respect to the number of channels in standard convolutional neural networks (CNNs).
3 code implementations • 29 May 2020 • Derek Onken, Samy Wu Fung, Xingjian Li, Lars Ruthotto
On five high-dimensional density estimation and generative modeling tasks, OT-Flow performs competitively with state-of-the-art CNFs while on average requiring one-fourth as many weights, with an 8x speedup in training time and a 24x speedup in inference.
1 code implementation • 27 May 2020 • Kelvin Kan, Samy Wu Fung, Lars Ruthotto
We present an interior point method to solve the quadratic projection problem efficiently.
Numerical Analysis
1 code implementation • 27 May 2020 • Derek Onken, Lars Ruthotto
Neural ODEs are ordinary differential equations (ODEs) with neural network components.
1 code implementation • 4 Dec 2019 • Lars Ruthotto, Stanley Osher, Wuchen Li, Levon Nurbekyan, Samy Wu Fung
State-of-the-art numerical methods for solving such problems rely on spatial discretizations that suffer from the curse of dimensionality.
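The curse of dimensionality referred to here is simply exponential storage growth: a grid with n points per dimension needs n**d points in d dimensions. The counts below (with an assumed 8 bytes per float64 value) illustrate why grid-based solvers become infeasible and grid-free methods are needed.

```python
n = 32                                   # grid points per dimension (illustrative)
points = {d: n ** d for d in (2, 4, 8, 16)}
bytes_needed = {d: 8 * p for d, p in points.items()}  # one float64 per grid point
# Already at d = 16, storing one scalar field exceeds any machine's memory.
```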
no code implementations • 29 Oct 2019 • Jonathan Ephrath, Moshe Eliasof, Lars Ruthotto, Eldad Haber, Eran Treister
In practice, the input data and the hidden features consist of a large number of channels, which in most CNNs are fully coupled by the convolution operators.
no code implementations • 15 Apr 2019 • Jonathan Ephrath, Lars Ruthotto, Eldad Haber, Eran Treister
Convolutional Neural Networks (CNNs) filter the input data using spatial convolution operators with compact stencils.
1 code implementation • 6 Mar 2019 • Eldad Haber, Keegan Lensink, Eran Treister, Lars Ruthotto
Deep convolutional neural networks have revolutionized many machine learning and computer vision tasks; however, several key challenges remain that limit their wider use.
1 code implementation • 27 Jan 2019 • Samy Wu Fung, Sanna Tyrväinen, Lars Ruthotto, Eldad Haber
Solution of the least-squares problem can be accelerated by pre-computing a factorization or preconditioner, and the smooth, convex problem is separable and thus easily parallelized across examples.
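The factorization idea can be sketched as follows (a generic illustration, not the paper's solver): factor the normal-equations matrix once, then each new right-hand side costs only two cheap triangular solves. The problem sizes are made up, and a generic `solve` stands in for dedicated triangular solvers.

```python
import numpy as np
from numpy.linalg import cholesky, solve

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 10))
L = cholesky(A.T @ A)          # pre-compute the Cholesky factor once

def ls_solve(b):
    # Solve the normal equations A^T A x = A^T b via L L^T x = A^T b.
    y = solve(L, A.T @ b)      # forward substitution (generic solve here)
    return solve(L.T, y)       # back substitution

b = rng.standard_normal(100)
x = ls_solve(b)                # each new b reuses the cached factor L
```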
no code implementations • 21 May 2018 • Eldad Haber, Felix Lucka, Lars Ruthotto
Further, we provide numerical examples that demonstrate the potential of our method for training deep neural networks.
1 code implementation • 12 Apr 2018 • Lars Ruthotto, Eldad Haber
In the latter area, PDE-based approaches interpret image data as discretizations of multivariate functions and the output of image processing algorithms as solutions to certain PDEs.
Ranked #69 on Image Classification on STL-10
2 code implementations • 12 Sep 2017 • Bo Chang, Lili Meng, Eldad Haber, Lars Ruthotto, David Begert, Elliot Holtham
In this work, we interpret deep residual networks as ordinary differential equations (ODEs), which have long been studied in mathematics and physics with rich theoretical and empirical success.
Ranked #49 on Image Classification on STL-10
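The ODE interpretation can be sketched in a few lines: each residual block x ← x + h·f(x) is one forward-Euler step of dx/dt = f(x, W(t)). The small tanh layer, weight scale, and step size below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def layer(x, W):
    # Illustrative residual-branch nonlinearity.
    return np.tanh(W @ x)

def resnet_forward(x, Ws, h=0.1):
    # Each residual block is one forward-Euler step of dx/dt = f(x, W(t));
    # the step size h controls the stability of the forward propagation.
    for W in Ws:
        x = x + h * layer(x, W)
    return x

rng = np.random.default_rng(3)
Ws = [0.1 * rng.standard_normal((4, 4)) for _ in range(10)]
x0 = rng.standard_normal(4)
xT = resnet_forward(x0, Ws)
```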
1 code implementation • 28 May 2017 • James Herring, James Nagy, Lars Ruthotto
LAP is most promising for cases when the subproblem corresponding to one of the variables is considerably easier to solve than the other.
4 code implementations • 9 May 2017 • Eldad Haber, Lars Ruthotto
While our new architectures restrict the solution space, several numerical experiments show their competitiveness with state-of-the-art networks.
no code implementations • 13 Mar 2017 • Andreas Mang, Lars Ruthotto
We use an optimal control formulation, in which the velocity field of a hyperbolic PDE needs to be found such that the distance between the final state of the system (the transformed/transported template image) and the observation (the reference image) is minimized.
1 code implementation • 6 Mar 2017 • Eldad Haber, Lars Ruthotto, Elliot Holtham, Seong-Hwan Jun
In this work, we establish the relation between optimal control and the training of deep Convolutional Neural Networks (CNNs).
3 code implementations • 23 Jun 2016 • Lars Ruthotto, Eran Treister, Eldad Haber
Estimating parameters of Partial Differential Equations (PDEs) from noisy and indirect measurements often requires solving ill-posed inverse problems.
Mathematical Software
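The ill-posedness can be illustrated with a toy linear inverse problem (a stand-in for a discretized PDE operator, not the paper's setup): tiny singular values amplify measurement noise, and Tikhonov regularization damps that amplification at the cost of some bias. All sizes, noise levels, and the regularization parameter below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
# Ill-conditioned forward operator: singular values from 1 down to 1e-8.
U, _, Vt = np.linalg.svd(rng.standard_normal((50, 50)))
A = U @ np.diag(np.logspace(0, -8, 50)) @ Vt
x_true = rng.standard_normal(50)
b = A @ x_true + 1e-4 * rng.standard_normal(50)   # noisy, indirect data

def tikhonov(A, b, alpha):
    # Regularized solution: minimize ||Ax - b||^2 + alpha * ||x||^2.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

x_naive = np.linalg.solve(A, b)   # noise amplified by tiny singular values
x_reg = tikhonov(A, b, 1e-4)      # damped, stable reconstruction
```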