no code implementations • 5 Oct 2023 • Rahul Parhi, Michael Unser
We investigate the function-space optimality (specifically, the Banach-space optimality) of a large class of shallow neural architectures with multivariate nonlinearities/activation functions.
2 code implementations • 21 Aug 2023 • Alexis Goujon, Sebastian Neumayer, Michael Unser
We propose to learn non-convex regularizers with a prescribed upper bound on their weak-convexity modulus.
no code implementations • 19 Jul 2023 • Yan Liu, Jonathan Dong, Thanh-an Pham, Francois Marelli, Michael Unser
Then, we introduce a calibration algorithm that recovers the unknown system parameters, which are then fed into the final 3D iterative reconstruction algorithm to produce a distortion-free volumetric image.
no code implementations • 18 Jul 2023 • Thanh-an Pham, Emmanuel Soubies, Ferréol Soulez, Michael Unser
We show that structural information can be extracted from single molecule localization microscopy (SMLM) data.
no code implementations • 31 Mar 2023 • Sebastian Neumayer, Lénaïc Chizat, Michael Unser
In supervised learning, the regularization path is sometimes used as a convenient theoretical proxy for the optimization path of gradient descent initialized from zero.
2 code implementations • 22 Nov 2022 • Alexis Goujon, Sebastian Neumayer, Pakshal Bohra, Stanislas Ducotterd, Michael Unser
The emergence of deep-learning-based methods to solve image-reconstruction problems has enabled a significant increase in reconstruction quality.
no code implementations • 11 Nov 2022 • Kay Lächler, Hélène Lajous, Michael Unser, Meritxell Bach Cuadra, Pol del Aguila Pla
In this paper, we sidestep this difficulty by providing a proof of concept of a self-supervised single-volume super-resolution framework (SAIR) for T2-weighted FBMRI.
no code implementations • 31 Oct 2022 • Selin Aviyente, Alejandro Frangi, Erik Meijering, Arrate Muñoz-Barrutia, Michael Liebling, Dimitri Van De Ville, Jean-Christophe Olivo-Marin, Jelena Kovačević, Michael Unser
The Bio Image and Signal Processing (BISP) Technical Committee (TC) of the IEEE Signal Processing Society (SPS) promotes activities within the broad technical field of biomedical image and signal processing.
1 code implementation • 28 Oct 2022 • Stanislas Ducotterd, Alexis Goujon, Pakshal Bohra, Dimitris Perdios, Sebastian Neumayer, Michael Unser
Lipschitz-constrained neural networks have several advantages over unconstrained ones and can be applied to a variety of problems, making them a topic of attention in the deep learning community.
1 code implementation • 16 Aug 2022 • Mehrsa Pourya, Alexis Goujon, Michael Unser
Rectified linear unit (ReLU) neural networks generate continuous and piecewise-linear (CPWL) mappings and are the state-of-the-art approach for solving regression problems.
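The CPWL property is easy to verify numerically. A minimal sketch (a toy one-hidden-layer network, not from the paper): away from the "knots" where a ReLU switches on or off, the network's slope is locally constant.

```python
import numpy as np

# Toy illustration of the CPWL property of ReLU networks:
# f(x) = w2 @ relu(W1*x + b1) + b2 is continuous and piecewise-linear,
# so its slope is constant on any interval containing no knot -b1/W1.

rng = np.random.default_rng(0)
W1 = rng.standard_normal(5)   # hidden-layer weights (1-D input, 5 neurons)
b1 = rng.standard_normal(5)
w2 = rng.standard_normal(5)   # linear readout weights
b2 = 0.3

def f(x):
    return w2 @ np.maximum(W1 * x + b1, 0.0) + b2

# Pick an interval beyond all knots, where no ReLU changes state.
knots = -b1 / W1
lo = knots.max() + 1.0
xs = np.linspace(lo, lo + 0.1, 50)

# Finite-difference slopes are constant there: f is locally affine.
slopes = np.diff([f(x) for x in xs]) / np.diff(xs)
print(np.allclose(slopes, slopes[0]))   # True
```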
no code implementations • 29 Jun 2022 • Michael Unser
By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator.
no code implementations • 17 Jun 2022 • Alexis Goujon, Arian Etemadi, Michael Unser
We first provide upper and lower bounds on the maximal number of linear regions of a CPWL NN given its depth, width, and the number of linear regions of its activation functions.
no code implementations • 14 Jun 2022 • Pol del Aguila Pla, Sebastian Neumayer, Michael Unser
Robustness and stability of image-reconstruction algorithms have recently come under scrutiny.
1 code implementation • 7 Jun 2022 • Jonathan Dong, Erik Börve, Mushegh Rafayelyan, Michael Unser
Reservoir Computing is a class of Recurrent Neural Networks with internal weights fixed at random.
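The defining feature, fixed random internal weights with a trained linear readout, can be sketched with a minimal echo-state network (a generic stand-in, not the paper's model; reservoir size, spectral radius, and the sine-prediction task are all illustrative choices):

```python
import numpy as np

# Minimal echo-state network: the recurrent "reservoir" weights are drawn
# at random and kept fixed; only the linear readout is trained, here by
# ridge regression on the collected reservoir states.

rng = np.random.default_rng(1)
N = 100                                        # reservoir size (assumed)
W_in = rng.uniform(-0.5, 0.5, N)               # fixed input weights
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)        # fixed random recurrence
        states.append(x.copy())
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave.
t = np.arange(400)
u = np.sin(0.1 * t)
states = run_reservoir(u[:-1])                 # states[t] predicts u[t+1]

washout = 50                                   # discard the transient
X = states[washout:]
y = u[washout + 1:]

# Train only the readout (ridge regression).
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
pred = X @ w_out
print(np.mean((pred - y) ** 2) < 1e-3)         # accurate prediction
```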
no code implementations • 28 Apr 2022 • P. del Aguila Pla, Michael Unser
The projection of sample measurements onto a reconstruction space represented by a basis on a regular grid is a powerful and simple approach to estimate a probability density function.
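In its simplest instance, projection onto indicator functions on a regular grid, this is a normalized histogram. A sketch under that assumption (the paper considers richer bases, e.g. B-splines; grid step and sample count here are arbitrary):

```python
import numpy as np

# Density estimation by projecting samples onto a basis of box functions
# on a regular grid: the basis coefficients are normalized bin counts.
# This is only the box-function case of the general grid-based approach.

rng = np.random.default_rng(5)
samples = rng.standard_normal(100_000)     # samples from N(0, 1)

h = 0.1                                    # grid step (assumed)
edges = np.arange(-5, 5 + h, h)
counts, _ = np.histogram(samples, bins=edges)
density = counts / (len(samples) * h)      # coefficients of the box basis

# The estimate integrates to ~1 and tracks the true N(0,1) pdf.
centers = 0.5 * (edges[:-1] + edges[1:])
true_pdf = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
print(abs(density.sum() * h - 1.0) < 1e-3)
print(np.max(np.abs(density - true_pdf)) < 0.05)
```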
no code implementations • 13 Apr 2022 • Sebastian Neumayer, Alexis Goujon, Pakshal Bohra, Michael Unser
Lipschitz-constrained neural networks have many applications in machine learning.
no code implementations • 18 Mar 2022 • Pakshal Bohra, Thanh-an Pham, Jonathan Dong, Michael Unser
In this work, we present a Bayesian reconstruction framework for nonlinear imaging models where we specify the prior knowledge on the image through a deep generative model.
no code implementations • 18 Mar 2022 • Pakshal Bohra, Pol del Aguila Pla, Jean-François Giovannelli, Michael Unser
We present a statistical framework to benchmark the performance of reconstruction algorithms for linear inverse problems, in particular, neural-network-based methods that require large quantities of training data.
no code implementations • 4 Mar 2022 • Michael Unser
Ridges appear in the theory of neural networks as functional descriptors of the effect of a neuron, with the direction vector being encoded in the linear weights.
no code implementations • 3 Feb 2022 • Icíar Lloréns Jover, Thomas Debarre, Shayan Aziznejad, Michael Unser
We prove that an optimal solution to the inverse problem is a closed curve with spline components.
1 code implementation • 12 Dec 2021 • Shayan Aziznejad, Joaquim Campos, Michael Unser
Our motivation for defining HTV is to assess the complexity of supervised-learning schemes.
no code implementations • NeurIPS Workshop Deep_Invers 2021 • Pakshal Bohra, Alexis Goujon, Dimitris Perdios, Sébastien Emery, Michael Unser
We show that averaged denoising operators built from 1-Lipschitz deep spline networks consistently outperform those built from 1-Lipschitz ReLU networks.
no code implementations • 8 Jul 2021 • Tao Hong, Thanh-an Pham, Eran Treister, Michael Unser
In this work, we introduce instead a Helmholtz-based nonlinear model for inverse scattering.
no code implementations • 24 Mar 2021 • Thomas Debarre, Shayan Aziznejad, Michael Unser
We present a novel framework for the reconstruction of 1D composite signals assumed to be a mixture of two additive components, one sparse and the other smooth, given a finite number of linear measurements.
no code implementations • 26 Oct 2020 • Quentin Denoyelle, Thanh-an Pham, Pol del Aguila Pla, Daniel Sage, Michael Unser
We propose the use of the Flat Metric to assess the performance of reconstruction methods for single-molecule localization microscopy (SMLM) in scenarios where the ground-truth is available.
no code implementations • 24 Sep 2020 • Fangshu Yang, Thanh-an Pham, Nathalie Brandenberg, Matthias P. Lutolf, Jianwei Ma, Michael Unser
Our work paves the way to reliable phase imaging of thick and complex samples with QPI.
no code implementations • 17 Jan 2020 • Shayan Aziznejad, Harshit Gupta, Joaquim Campos, Michael Unser
To that end, we first establish a global bound for the Lipschitz constant of neural networks.
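For intuition, the classical global bound for a feedforward network with 1-Lipschitz activations is the product of the layers' spectral norms; the paper's analysis is finer, so the sketch below is only an illustration of that standard bound (weights and architecture are arbitrary):

```python
import numpy as np

# Classical global Lipschitz bound for a feedforward ReLU network:
# Lip(f) <= prod_l ||W_l||_2 (product of layer spectral norms),
# since each ReLU is 1-Lipschitz. Checked empirically on random pairs.

rng = np.random.default_rng(2)
Ws = [rng.standard_normal((8, 4)), rng.standard_normal((3, 8))]

def net(x):
    for W in Ws[:-1]:
        x = np.maximum(W @ x, 0.0)         # ReLU layers
    return Ws[-1] @ x                      # linear output layer

bound = np.prod([np.linalg.norm(W, 2) for W in Ws])

# Empirical Lipschitz ratios never exceed the bound.
ratios = []
for _ in range(1000):
    x, y = rng.standard_normal(4), rng.standard_normal(4)
    ratios.append(np.linalg.norm(net(x) - net(y)) / np.linalg.norm(x - y))
print(max(ratios) <= bound)                # True
```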
1 code implementation • 3 Oct 2019 • Jaejun Yoo, Kyong Hwan Jin, Harshit Gupta, Jerome Yerly, Matthias Stuber, Michael Unser
The key ingredients of our method are threefold: 1) a fixed low-dimensional manifold that encodes the temporal variations of images; 2) a network that maps the manifold into a more expressive latent space; and 3) a convolutional neural network that generates a dynamic series of MRI images from the latent variables and that favors their consistency with the measurements in k-space.
no code implementations • 24 Apr 2019 • Michael Unser, Julien Fageot
In short, the native space for ${\rm L}$ and the (dual) norm $\|\cdot\|_{\mathcal{X}'}$ is the largest space of functions $f: \mathbb{R}^d \to \mathbb{R}$ such that $\|{\rm L} f\|_{\mathcal{X}'}<\infty$, subject to the constraint that the growth-restricted null space of ${\rm L}$ be finite-dimensional.
no code implementations • 2 Mar 2019 • Michael Unser
We then use our theorem to retrieve a number of known results in the literature (e.g., the celebrated representer theorem of machine learning for RKHS, Tikhonov regularization, representer theorems for sparsity-promoting functionals, the recovery of spikes), as well as a few new ones.
no code implementations • 14 Jan 2019 • Kyong Hwan Jin, Michael Unser, Kwang Moo Yi
The reconstruction network is trained to give the highest reconstruction quality, given the MCTS sampling pattern.
1 code implementation • 19 Dec 2018 • Emmanuel Soubies, Ferréol Soulez, Michael T. McCann, Thanh-an Pham, Laurène Donati, Thomas Debarre, Daniel Sage, Michael Unser
GlobalBioIm is an open-source MATLAB library for solving inverse problems.
Mathematical Software
no code implementations • 2 Nov 2018 • Shayan Aziznejad, Michael Unser
In this paper, we provide a Banach-space formulation of supervised learning with generalized total-variation (gTV) regularization.
no code implementations • 23 Oct 2018 • Julien Fageot, Virginie Uhlmann, Zsuzsanna Püspöki, Benjamin Beck, Michael Unser, Adrien Depeursinge
We provide a complete pipeline for the detection of patterns of interest in an image.
no code implementations • 12 Jun 2018 • Michael T. McCann, Vincent Andrearczyk, Michael Unser, Adrien Depeursinge
In this work, we propose an algorithm for a rotational version of sparse coding that is based on K-SVD with additional rotation operations.
no code implementations • 15 Mar 2018 • Carsten Haubold, Virginie Uhlmann, Michael Unser, Fred A. Hamprecht
Many computer vision pipelines involve dynamic programming primitives such as finding a shortest path or the minimum energy solution in a tree-shaped probabilistic graphical model.
no code implementations • 26 Feb 2018 • Michael Unser
We propose to optimize the activation functions of a deep neural network by adding a corresponding functional regularization to the cost function.
no code implementations • 6 Feb 2018 • Denis Fortun, Martin Storath, Dennis Rickert, Andreas Weinmann, Michael Unser
Current algorithmic approaches for piecewise affine motion estimation are based on alternating motion segmentation and estimation.
2 code implementations • 11 Oct 2017 • Michael T. McCann, Kyong Hwan Jin, Michael Unser
In this survey paper, we review recent uses of convolutional neural networks (CNNs) to solve inverse problems in imaging.
no code implementations • 6 Sep 2017 • Harshit Gupta, Kyong Hwan Jin, Ha Q. Nguyen, Michael T. McCann, Michael Unser
When the projector is replaced with a CNN, we propose a relaxed PGD, which always converges.
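The iteration structure can be sketched schematically. In the paper the projector is a trained CNN; below, a box projection onto $[0,1]$ is a stand-in, the forward operator is a well-conditioned demo matrix, and the relaxation parameter is an arbitrary choice. The relaxed step averages the projected update with the previous iterate:

```python
import numpy as np

# Schematic (relaxed) projected-gradient-descent loop for min ||Hx - y||^2
# with iterates constrained by a projector P. P here is a simple box
# projection; in the paper it is a trained CNN.

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((20, 10)))
H = Q                                      # well-conditioned operator (demo)
x_true = rng.uniform(0.0, 1.0, 10)
y = H @ x_true                             # noiseless measurements

P = lambda x: np.clip(x, 0.0, 1.0)         # stand-in projector
gamma = 1.0 / np.linalg.norm(H, 2) ** 2    # step size <= 1/||H||^2
alpha = 0.5                                # relaxation parameter (assumed)

x = np.zeros(10)
for _ in range(200):
    z = P(x - gamma * H.T @ (H @ x - y))   # projected gradient step
    x = (1 - alpha) * x + alpha * z        # relaxed update (averaging)

print(np.linalg.norm(H @ x - y) < 1e-8)    # converged to data consistency
```

Setting `alpha = 1` recovers plain PGD; the averaging is what tames a projector (such as a CNN) that is not a true metric projection.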
no code implementations • ICML 2017 • Pedram Pad, Farnood Salehi, Elisa Celis, Patrick Thiran, Michael Unser
We propose a new statistical dictionary learning algorithm for sparse signals that is based on an $\alpha$-stable innovation model.
1 code implementation • 11 Jul 2017 • Emmanuel Soubies, Thanh-an Pham, Michael Unser
Optical diffraction tomography relies on solving an inverse scattering problem governed by the wave equation.
Computational Engineering, Finance, and Science • Numerical Analysis • Data Analysis, Statistics and Probability • Optics
no code implementations • 16 May 2017 • Ha Q. Nguyen, Emrah Bostan, Michael Unser
We propose a data-driven algorithm for the maximum a posteriori (MAP) estimation of stochastic processes from noisy observations.
no code implementations • 11 Nov 2016 • Kyong Hwan Jin, Michael T. McCann, Emmanuel Froustey, Michael Unser
The starting point of our work is the observation that unrolled iterative methods have the form of a CNN (filtering followed by a pointwise nonlinearity) when the normal operator ($H^*H$, the adjoint of $H$ times $H$) of the forward model is a convolution.
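This observation can be checked directly in one dimension with circular convolutions (the blur kernel and signal length below are illustrative assumptions): one gradient step equals a single convolution of the iterate plus a fixed bias, exactly the filtering stage of a CNN layer.

```python
import numpy as np

# If H is a (circular) convolution with kernel h, the Landweber step
#   x - gamma * H^T (H x - y)
# equals conv(delta - gamma * (h_adj * h), x) + gamma * conv(h_adj, y):
# a single filter applied to x plus a fixed bias. Interleaving pointwise
# nonlinearities then gives exactly the layer pattern of a CNN.

rng = np.random.default_rng(4)
n = 64
h = np.zeros(n); h[:3] = [0.25, 0.5, 0.25]    # blur kernel (assumed)

def conv(a, b):
    # circular convolution via FFT (exact algebra, no boundary effects)
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

x0 = rng.standard_normal(n)                   # current iterate
y = conv(h, rng.standard_normal(n))           # measurements
gamma = 0.5
h_adj = np.roll(h[::-1], 1)                   # adjoint kernel (time reversal)

# One gradient (Landweber) step, operator style:
step_a = x0 - gamma * conv(h_adj, conv(h, x0) - y)

# The same step as one convolution of x0 plus a fixed bias:
delta = np.zeros(n); delta[0] = 1.0
kernel = delta - gamma * conv(h_adj, h)       # the "filter" of the layer
bias = gamma * conv(h_adj, y)
step_b = conv(kernel, x0) + bias

print(np.allclose(step_a, step_b))            # True
```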
no code implementations • 7 Dec 2015 • Zsuzsanna Püspöki, John Paul Ward, Daniel Sage, Michael Unser
In analogy with steerable wavelets, we present a general construction of adaptable tight wavelet frames, with an emphasis on scaling operations.
no code implementations • NeurIPS 2012 • Ulugbek Kamilov, Sundeep Rangan, Michael Unser, Alyson K. Fletcher
We present a method, called adaptive generalized approximate message passing (Adaptive GAMP), that enables joint learning of the statistics of the prior and measurement channel along with estimation of the unknown vector $\mathbf{x}$.