no code implementations • 26 Feb 2024 • Amanda Olmin, Jakob Lindqvist, Lennart Svensson, Fredrik Lindsten
Noise-contrastive estimation (NCE) is a popular method for estimating unnormalised probabilistic models, such as energy-based models, which are effective for modelling complex data distributions.
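As a toy illustration of the NCE principle (not the paper's method; the model, noise distribution, and all constants below are my own choices): NCE fits an unnormalised model by logistic regression between data samples and noise samples. Here the unnormalised model is $\log f(x;c) = -x^2/2 + c$, so the only learned parameter is the log-normaliser $c$, whose true value is $-\tfrac12\log 2\pi \approx -0.919$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data from N(0, 1); unnormalised model: log f(x; c) = -x^2/2 + c,
# where c plays the role of the learned (negative) log-partition function.
# Ground truth: c* = -0.5 * log(2*pi) ~= -0.9189.
x_data = rng.standard_normal(20000)
x_noise = rng.uniform(-4.0, 4.0, 20000)   # noise density: 1/8 on [-4, 4]
log_pn = np.log(1.0 / 8.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

c = 0.0
lr = 0.1
for _ in range(500):
    # Logit of "is this a data sample?": log f(x; c) - log p_noise(x)
    g_data = (-0.5 * x_data**2 + c) - log_pn
    g_noise = (-0.5 * x_noise**2 + c) - log_pn
    # Gradient of the NCE (logistic) objective with respect to c
    grad = np.mean(1.0 - sigmoid(g_data)) - np.mean(sigmoid(g_noise))
    c += lr * grad
```

After training, `c` should be close to the true log-normaliser, illustrating how NCE estimates the partition function as an ordinary model parameter.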
no code implementations • 24 Oct 2023 • Filip Ekström Kelvinius, Fredrik Lindsten
The use of a discriminator to guide a diffusion process has previously been used for continuous diffusion models, and in this work we derive ways of using a discriminator together with a pretrained generative model in the discrete case.
1 code implementation • 29 Sep 2023 • Joel Oskarsson, Tomas Landelius, Fredrik Lindsten
The rise of accurate machine learning methods for weather forecasting is creating radical new possibilities for modeling the atmosphere.
1 code implementation • 16 Feb 2023 • Joel Oskarsson, Per Sidén, Fredrik Lindsten
Our TGNN4I model is designed to handle both irregular time steps and partial observations of the graph.
no code implementations • ICLR 2023 • Hariprasath Govindarajan, Per Sidén, Jacob Roll, Fredrik Lindsten
With this interpretation, DINO assumes equal precision for all components when the prototypes are also $L^2$-normalized.
Ranked #24 on Self-Supervised Image Classification on ImageNet

1 code implementation • ICLR 2021 • David Widmann, Fredrik Lindsten, Dave Zachariah
In the machine learning literature, different measures and statistical tests have been proposed and studied for evaluating the calibration of classification models.
no code implementations • 14 Oct 2022 • Heiko Zimmermann, Fredrik Lindsten, Jan-Willem van de Meent, Christian A. Naesseth
Generative flow networks (GFNs) are a class of models for sequential sampling of composite objects, which approximate a target distribution that is defined in terms of an energy function or a reward.
no code implementations • 13 Oct 2022 • Anna Wigren, Fredrik Lindsten
We provide insights on when each sampler should be used and show that they can be combined to form an efficient PG sampler for a model with strong dependencies between states and parameters.
1 code implementation • 10 Jun 2022 • Joel Oskarsson, Per Sidén, Fredrik Lindsten
We propose a flexible GMRF model for general graphs built on the multi-layer structure of Deep GMRFs, originally proposed for lattice graphs only.
1 code implementation • 18 Apr 2022 • Amanda Olmin, Jakob Lindqvist, Lennart Svensson, Fredrik Lindsten
Annotating data for supervised learning can be costly.
no code implementations • 7 Oct 2021 • Amanda Olmin, Fredrik Lindsten
We find that strictly proper and robust loss functions both offer asymptotic robustness in accuracy, but neither guarantee that the final model is calibrated.
no code implementations • NeurIPS 2020 • Christian A. Naesseth, Fredrik Lindsten, David Blei
Modern variational inference (VI) uses stochastic gradients to avoid intractable expectations, enabling large-scale probabilistic inference in complex models.
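A minimal sketch of VI with stochastic gradients (a generic reparameterisation-trick example, not this paper's estimator; the target and step sizes are illustrative assumptions): fit $q(z)=\mathcal N(m,s^2)$ to the target $\mathcal N(3,1)$ by ascending Monte Carlo gradients of the ELBO.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: log p(z) = -0.5 * (z - 3)^2 + const, i.e. N(3, 1).
# Variational family: q(z) = N(m, s^2), reparameterised as z = m + s * eps.
m, log_s = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    eps = rng.standard_normal(64)
    s = np.exp(log_s)
    z = m + s * eps
    # Stochastic ELBO gradients via the reparameterisation trick;
    # the +1 in grad_log_s comes from the entropy of q.
    grad_m = np.mean(-(z - 3.0))
    grad_log_s = np.mean(-(z - 3.0) * s * eps) + 1.0
    m += lr * grad_m
    log_s += lr * grad_log_s
```

The iterates converge to the exact posterior parameters $m \to 3$, $s \to 1$, the intractable expectations having been replaced by minibatch Monte Carlo estimates throughout.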
1 code implementation • 26 Feb 2020 • Jakob Lindqvist, Amanda Olmin, Fredrik Lindsten, Lennart Svensson
Ensembles of neural networks have been shown to give better performance than single networks, both in terms of predictions and uncertainty estimation.
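A small sketch of the standard ensemble recipe the snippet alludes to (the logits below are random stand-ins for real network outputs): average the members' predictive distributions, and decompose predictive uncertainty into aleatoric and epistemic parts.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical logits from M = 5 ensemble members for one input, 3 classes.
member_logits = rng.normal(size=(5, 3))

# Ensemble prediction: average the members' predictive distributions.
p_members = softmax(member_logits)
p_ensemble = p_members.mean(axis=0)

def entropy(p):
    return -np.sum(p * np.log(p + 1e-12), axis=-1)

# Uncertainty decomposition: total = aleatoric + epistemic, where the
# epistemic part is the mutual information between prediction and member
# (non-negative by concavity of entropy).
total = entropy(p_ensemble)
aleatoric = entropy(p_members).mean()
epistemic = total - aleatoric
```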
1 code implementation • ICML 2020 • Per Sidén, Fredrik Lindsten
Gaussian Markov random fields (GMRFs) are probabilistic graphical models widely used in spatial statistics and related fields to model dependencies over spatial structures.
1 code implementation • NeurIPS 2019 • Anna Wigren, Riccardo Sven Risuleo, Lawrence Murray, Fredrik Lindsten
Bayesian inference in state-space models is challenging due to high-dimensional state trajectories.
1 code implementation • NeurIPS 2019 • David Widmann, Fredrik Lindsten, Dave Zachariah
In safety-critical applications a probabilistic model is usually required to be calibrated, i.e., to capture the uncertainty of its predictions accurately.
no code implementations • 21 Oct 2019 • Jan Kudlicka, Lawrence M. Murray, Thomas B. Schön, Fredrik Lindsten
While the variance reducing properties of rejection control are known, there has not been (to the best of our knowledge) any work on unbiased estimation of the marginal likelihood (also known as the model evidence or the normalizing constant) in this type of particle filter.
no code implementations • 25 Sep 2019 • Jalil Taghia, Maria Bånkestad, Fredrik Lindsten, Thomas Schön
Models that output a vector of responses given some inputs, in the form of a conditional mean vector, are at the core of machine learning.
no code implementations • 12 Mar 2019 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations.
1 code implementation • 19 Feb 2019 • Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön
Probabilistic classifiers output a probability distribution on target classes rather than just a class prediction.
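A common way to quantify (mis)calibration of such probability outputs is the binned expected calibration error, sketched below (a standard construction, not the specific estimators or tests studied in the paper):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE: per-bin |accuracy - mean confidence|, weighted by the
    fraction of samples falling in the bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - confidences[mask].mean())
    return ece
```

For example, a classifier that predicts 90% confidence and is right 9 times out of 10 has zero ECE, while one that predicts 90% and is always wrong has ECE 0.9.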
no code implementations • 4 Feb 2019 • Jalil Taghia, Maria Bånkestad, Fredrik Lindsten, Thomas B. Schön
However, in certain scenarios we are interested in learning structured parameters (predictions) in the form of symmetric positive definite matrices.
2 code implementations • NeurIPS 2018 • Fredrik Lindsten, Jouni Helske, Matti Vihola
Approximate inference in probabilistic graphical models (PGMs) can be grouped into deterministic methods and Monte-Carlo-based methods.
no code implementations • 25 Jun 2018 • Andreas Lindholm, Fredrik Lindsten
By combining stochastic approximation EM and particle Gibbs with ancestor sampling (PGAS), PSAEM obtains superior computational performance and convergence properties compared to plain particle-smoothing-based approximations of the EM algorithm.
1 code implementation • NeurIPS 2019 • Christopher Nemeth, Fredrik Lindsten, Maurizio Filippone, James Hensman
In this paper, we introduce the pseudo-extended MCMC method as a simple approach for improving the mixing of the MCMC sampler for multi-modal posterior distributions.
no code implementations • 7 Mar 2017 • Thomas B. Schön, Andreas Svensson, Lawrence Murray, Fredrik Lindsten
We are concerned with the problem of learning probabilistic models of dynamical systems from measured data.
no code implementations • 6 Feb 2017 • Andreas Svensson, Thomas B. Schön, Fredrik Lindsten
In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful.
1 code implementation • 8 Jan 2017 • Pierre E. Jacob, Fredrik Lindsten, Thomas B. Schön
The method combines two recent breakthroughs: the first is a generic debiasing technique for Markov chains due to Rhee and Glynn, and the second is the introduction of a uniformly ergodic Markov chain for smoothing, the conditional particle filter of Andrieu, Doucet and Holenstein.
no code implementations • 29 Dec 2016 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
Sequential Monte Carlo (SMC) methods comprise one of the most successful approaches to approximate Bayesian filtering.
no code implementations • 8 Jul 2016 • Johan Alenlöv, Arnaud Doucet, Fredrik Lindsten
When following a Markov chain Monte Carlo (MCMC) approach to approximate the posterior distribution in this context, one typically either uses MCMC schemes which target the joint posterior of the parameters and some auxiliary latent variables, or pseudo-marginal Metropolis-Hastings (MH) schemes.
1 code implementation • 16 Feb 2016 • Tom Rainforth, Christian A. Naesseth, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet, Frank Wood
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.
no code implementations • 17 Nov 2015 • Johan Dahlin, Fredrik Lindsten, Joel Kronander, Thomas B. Schön
Pseudo-marginal Metropolis-Hastings (pmMH) is a powerful method for Bayesian inference in models where the posterior distribution is analytically intractable or computationally costly to evaluate directly.
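A minimal pmMH sketch (a generic toy example, not the paper's algorithm; model, proposal scale, and Monte Carlo sizes are my assumptions): the exact likelihood is replaced by an unbiased, non-negative importance-sampling estimate, and the chain still targets the exact posterior provided the estimate for the current state is reused rather than recomputed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model: y = theta + z + e, with z, e ~ N(0, 1),
# so the exact marginal likelihood is N(y; theta, 2).
y = 1.5

def loglik_hat(theta, n_mc=64):
    """Unbiased non-negative estimate of p(y | theta), on the log scale,
    by Monte Carlo averaging over the latent z."""
    z = rng.standard_normal(n_mc)
    dens = np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2 * np.pi)
    return np.log(dens.mean())

# pmMH with a flat prior on theta: accept using the *estimated* likelihood,
# and keep the current state's estimate fixed between proposals.
theta, ll = 0.0, loglik_hat(0.0)
samples = []
for _ in range(20000):
    prop = theta + 0.8 * rng.standard_normal()
    ll_prop = loglik_hat(prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    samples.append(theta)

post_mean = np.mean(samples[5000:])   # exact posterior here: N(y, 2)
```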
no code implementations • 20 Mar 2015 • Thomas B. Schön, Fredrik Lindsten, Johan Dahlin, Johan Wågberg, Christian A. Naesseth, Andreas Svensson, Liang Dai
One of the key challenges in identifying nonlinear and possibly non-Gaussian state space models (SSMs) is the intractability of estimating the system state.
1 code implementation • 12 Feb 2015 • Johan Dahlin, Fredrik Lindsten, Thomas B. Schön
A possible application is parameter inference in the challenging class of SSMs with intractable likelihoods.
1 code implementation • 9 Feb 2015 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
NSMC generalises the SMC framework by requiring only approximate, properly weighted samples from the SMC proposal distribution, while still resulting in a correct SMC algorithm.
no code implementations • 9 Jan 2015 • Simon Lacoste-Julien, Fredrik Lindsten, Francis Bach
Recently, the Frank-Wolfe optimization algorithm was suggested as a procedure to obtain adaptive quadrature rules for integrals of functions in a reproducing kernel Hilbert space (RKHS) with a potentially faster rate of convergence than Monte Carlo integration (and "kernel herding" was shown to be a special case of this procedure).
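To make the kernel-herding special case concrete (one common greedy form of herding, with a Gaussian target and kernel chosen for their closed-form mean embedding; bandwidth, grid, and sample count are my assumptions): each new quadrature point greedily maximises the kernel mean embedding minus the average kernel evaluation against the points chosen so far.

```python
import numpy as np

# Kernel herding for p = N(0, 1) with Gaussian kernel
# k(x, y) = exp(-(x - y)^2 / (2 h^2)); the mean embedding
# mu(x) = E_p[k(x, X)] is available in closed form for this pair.
h = 0.5
grid = np.linspace(-3.0, 3.0, 601)   # candidate points

def mu(x):
    return h / np.sqrt(h**2 + 1.0) * np.exp(-x**2 / (2.0 * (h**2 + 1.0)))

def k(x, y):
    return np.exp(-(x - y) ** 2 / (2.0 * h**2))

points = []
for t in range(50):
    # Greedy herding step: maximise mu(x) minus the average kernel
    # similarity to the points already selected.
    score = mu(grid)
    if points:
        score = score - np.mean(k(grid[:, None], np.array(points)[None, :]), axis=1)
    points.append(grid[np.argmax(score)])

points = np.array(points)
```

The unweighted average over the herded points approximates expectations under $p$; the Frank-Wolfe view of this paper explains when line-search weights improve on these uniform $1/T$ weights.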
no code implementations • 25 Sep 2014 • Andreas Svensson, Thomas B. Schön, Fredrik Lindsten
Jump Markov linear models consist of a finite number of linear state space models and a discrete variable encoding the jumps (or switches) between the different linear models.
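A short simulation makes the model class concrete (the two modes and transition probabilities below are invented for illustration): a Markov chain $s_t$ selects which linear dynamic generates $x_t$ at each step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two linear modes with different dynamics and noise scales, plus a
# Markov chain s_t switching between them:
#   x_t = a[s_t] * x_{t-1} + q[s_t] * v_t,  v_t ~ N(0, 1)
a = np.array([0.95, 0.5])
q = np.array([0.1, 1.0])
P = np.array([[0.98, 0.02],   # mode transition probabilities
              [0.05, 0.95]])

T = 500
s = np.zeros(T, dtype=int)
x = np.zeros(T)
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
    x[t] = a[s[t]] * x[t - 1] + q[s[t]] * rng.standard_normal()
```

Inference then amounts to jointly recovering the continuous state trajectory and the discrete mode sequence from noisy observations of `x`.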
3 code implementations • 19 Jun 2014 • Fredrik Lindsten, Adam M. Johansen, Christian A. Naesseth, Bonnie Kirkpatrick, Thomas B. Schön, John Aston, Alexandre Bouchard-Côté
We propose a novel class of Sequential Monte Carlo (SMC) algorithms, appropriate for inference in probabilistic graphical models.
no code implementations • NeurIPS 2014 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
We propose a new framework for how to use sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGM).
no code implementations • 3 Jan 2014 • Fredrik Lindsten, Michael I. Jordan, Thomas B. Schön
Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC).
no code implementations • 17 Dec 2013 • Roger Frigola, Fredrik Lindsten, Thomas B. Schön, Carl E. Rasmussen
Gaussian process state-space models (GP-SSMs) are a very flexible family of models of nonlinear dynamical systems.
no code implementations • 4 Nov 2013 • Johan Dahlin, Fredrik Lindsten
Finally, we use a heuristic procedure to obtain a revised parameter iterate, providing an automatic trade-off between exploration and exploitation of the surrogate model.
no code implementations • 4 Nov 2013 • Johan Dahlin, Fredrik Lindsten, Thomas B. Schön
Particle Metropolis-Hastings (PMH) allows for Bayesian parameter inference in nonlinear state space models by combining Markov chain Monte Carlo (MCMC) and particle filtering.
no code implementations • NeurIPS 2013 • Roger Frigola, Fredrik Lindsten, Thomas B. Schön, Carl E. Rasmussen
State-space models are successfully used in many areas of science, engineering and economics to model time series and dynamical systems.
no code implementations • NeurIPS 2012 • Fredrik Lindsten, Thomas Schön, Michael I. Jordan
We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PG-AS).