no code implementations • 26 May 2023 • Felix Jimenez, Matthias Katzfuss
For regression tasks, standard Gaussian processes (GPs) provide natural uncertainty quantification, while deep neural networks (DNNs) excel at representation learning.
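The natural uncertainty quantification of a standard GP comes from its closed-form posterior: both a predictive mean and a predictive variance are available at every test input. A minimal numpy sketch (kernel, length-scale, and noise values are illustrative, not from the paper):

```python
import numpy as np

def rbf(x1, x2, ls=0.5):
    """Squared-exponential kernel matrix between two 1-D input arrays."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Exact GP posterior mean and pointwise variance."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    Kss = rbf(x_test, x_test)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.clip(np.diag(cov), 0.0, None)

x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x)
xt = np.array([0.05, 0.5, 2.0])  # last test point is far from the data
mean, var = gp_posterior(x, y, xt)
# predictive variance grows away from the training inputs
```

The variance at the extrapolation point (2.0) approaches the prior variance, while it collapses near the data — exactly the behavior a DNN point predictor lacks out of the box.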
1 code implementation • 30 Jan 2023 • Jian Cao, Myeongjong Kang, Felix Jimenez, Huiyan Sang, Florian Schäfer, Matthias Katzfuss
To achieve scalable and accurate inference for latent Gaussian processes, we propose a variational approximation based on a family of Gaussian distributions whose covariance matrices have sparse inverse Cholesky (SIC) factors.
no code implementations • 2 Mar 2022 • Felix Jimenez, Matthias Katzfuss
We focus on the use of our warped Vecchia GP in trust-region Bayesian optimization via Thompson sampling.
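As a rough illustration of that optimization loop — using a vanilla GP in place of the paper's warped Vecchia GP, with made-up objective, bounds, and trust-region radius — Thompson sampling draws one posterior sample over candidates restricted to a region around the incumbent and moves to that sample's minimizer:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=0.2):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def objective(x):
    return np.sin(3 * x) + 0.1 * x  # toy objective to minimize

X = rng.uniform(0, 2, 5)   # initial design
y = objective(X)
radius = 0.4               # trust-region half-width (illustrative)

for _ in range(10):
    center = X[np.argmin(y)]                 # incumbent best point
    lo, hi = max(0.0, center - radius), min(2.0, center + radius)
    cand = rng.uniform(lo, hi, 100)          # candidates inside the trust region
    # GP posterior over the candidates
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    Ks = rbf(cand, X)
    mean = Ks @ np.linalg.solve(K, y)
    cov = rbf(cand, cand) - Ks @ np.linalg.solve(K, Ks.T)
    # Thompson sampling: one joint posterior draw, then minimize it
    w, V = np.linalg.eigh(cov)
    sample = mean + V @ (np.sqrt(np.clip(w, 0, None)) * rng.standard_normal(len(cand)))
    x_next = cand[np.argmin(sample)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))
```

The trust region keeps each draw local, which is what makes the approach pair well with a GP surrogate whose accuracy is best near observed data.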
1 code implementation • 25 Feb 2022 • Jian Cao, Joseph Guinness, Marc G. Genton, Matthias Katzfuss
Gaussian process (GP) regression is a flexible, nonparametric approach to regression that naturally quantifies uncertainty.
1 code implementation • 19 May 2020 • Kyle P Messier, Matthias Katzfuss
Nitrogen dioxide (NO$_2$) is a primary constituent of traffic-related air pollution and has well-established harmful environmental and human-health impacts.
Applications
1 code implementation • 1 May 2020 • Matthias Katzfuss, Joseph Guinness, Earl Lawrence
Many scientific phenomena are studied using computer experiments consisting of multiple runs of a computer model while varying the input settings.
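The input settings for such runs are typically chosen by a space-filling design. A common choice is a Latin hypercube, which places exactly one run in each equal-probability stratum of every input; a small numpy sketch (the helper name is ours):

```python
import numpy as np

def latin_hypercube(n_runs, n_inputs, rng):
    """Latin hypercube design on [0, 1)^d: one point per stratum
    in each input dimension, strata shuffled independently."""
    u = np.empty((n_runs, n_inputs))
    for j in range(n_inputs):
        u[:, j] = (rng.permutation(n_runs) + rng.uniform(size=n_runs)) / n_runs
    return u

rng = np.random.default_rng(1)
design = latin_hypercube(8, 3, rng)  # 8 runs of a 3-input computer model
```

Each column visits all 8 strata exactly once, so even a small number of expensive model runs covers every input's range.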
1 code implementation • 29 Apr 2020 • Florian Schäfer, Matthias Katzfuss, Houman Owhadi
We propose to compute a sparse approximate inverse Cholesky factor $L$ of a dense covariance matrix $\Theta$ by minimizing the Kullback-Leibler divergence between the Gaussian distributions $\mathcal{N}(0, \Theta)$ and $\mathcal{N}(0, L^{-\top} L^{-1})$, subject to a sparsity constraint.
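A key point of this formulation is that the KL-optimal entries of each column of $L$ are available in closed form from a small submatrix of $\Theta$: column $i$ is $\Theta_{s_i,s_i}^{-1} e_1$, normalized so the diagonal entry of $L L^\top$'s factorization is consistent. A numpy sketch (function names ours; with dense lower-triangular sparsity the approximation is exact, giving $L L^\top = \Theta^{-1}$):

```python
import numpy as np

def kl_optimal_L(Theta, sparsity):
    """KL-optimal lower-triangular inverse Cholesky factor for the given
    sparsity pattern; sparsity[i] lists the row indices of column i,
    starting with i itself. Closed-form per-column solve."""
    n = Theta.shape[0]
    L = np.zeros((n, n))
    for i in range(n):
        s = np.asarray(sparsity[i])          # s[0] must equal i
        Ti = np.linalg.inv(Theta[np.ix_(s, s)])
        L[s, i] = Ti[:, 0] / np.sqrt(Ti[0, 0])  # Theta_{s,s}^{-1} e_1, scaled
    return L

# small SPD test covariance
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
Theta = A @ A.T + 5 * np.eye(5)

# dense (full) lower-triangular sparsity: zero KL divergence
full = [list(range(i, 5)) for i in range(5)]
L = kl_optimal_L(Theta, full)
```

With a genuinely sparse pattern, each column costs only a small dense solve of the size of its sparsity set, which is what makes the factorization scalable.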
Numerical Analysis, Optimization and Control, Statistics Theory, Computation
2 code implementations • 18 Jun 2019 • Daniel Zilber, Matthias Katzfuss
Generalized Gaussian processes (GGPs) are highly flexible models that combine latent GPs with potentially non-Gaussian likelihoods from the exponential family.
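The generative structure of a GGP is easy to state: a latent GP draw is pushed through an exponential-family likelihood. A minimal numpy sketch with a Poisson likelihood and log link (kernel and length-scale values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# latent GP on a 1-D grid
x = np.linspace(0, 1, 50)
K = (np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.2) ** 2)
     + 1e-6 * np.eye(50))                      # jitter for numerical stability
f = np.linalg.cholesky(K) @ rng.standard_normal(50)   # latent GP draw

# non-Gaussian exponential-family likelihood: y_i ~ Poisson(exp(f_i))
y = rng.poisson(np.exp(f))
```

Because the likelihood is non-Gaussian, the posterior over `f` given `y` is no longer a GP, which is what motivates the approximate-inference machinery in the paper.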
Methodology, Computation
1 code implementation • 8 May 2018 • Matthias Katzfuss, Joseph Guinness, Wenlong Gong
Gaussian processes (GPs) are highly flexible function estimators used for geospatial analysis, nonparametric regression, and machine learning, but they are computationally infeasible for large datasets.
Methodology, Computation
1 code implementation • 21 Aug 2017 • Matthias Katzfuss, Joseph Guinness
Gaussian processes (GPs) are commonly used as models for functions, time series, and spatial fields, but they are computationally infeasible for large datasets.
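The Vecchia approach addresses this infeasibility by replacing the joint density with a product of univariate conditionals, each conditioning only on a small set of nearest previously ordered observations. A numpy sketch (kernel, ordering, and neighbor selection simplified relative to the paper's general framework; with all previous points as neighbors it reduces to the exact likelihood):

```python
import numpy as np

def vecchia_loglik(z, x, m, ls=0.3, nug=1e-4):
    """Vecchia log-likelihood: observation i conditions only on its m
    nearest previously ordered neighbors (illustrative 1-D sketch)."""
    n = len(z)
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / ls) ** 2) + nug * np.eye(n)
    ll = 0.0
    for i in range(n):
        prev = np.argsort(np.abs(x[:i] - x[i]))[:m]  # nearest earlier points
        if len(prev) == 0:
            mu, var = 0.0, K[i, i]
        else:
            Kic = K[i, prev]
            w = np.linalg.solve(K[np.ix_(prev, prev)], Kic)
            mu = w @ z[prev]                 # conditional mean
            var = K[i, i] - w @ Kic          # conditional variance
        ll += -0.5 * (np.log(2 * np.pi * var) + (z[i] - mu) ** 2 / var)
    return ll

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 1, 20))
Kfull = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.3) ** 2) + 1e-4 * np.eye(20)
z = np.linalg.cholesky(Kfull) @ rng.standard_normal(20)
ll_vecchia = vecchia_loglik(z, x, m=3)   # conditions on only 3 neighbors
```

Each term involves at most an $m \times m$ solve, so the cost is linear in $n$ for fixed $m$, instead of the cubic cost of the exact GP likelihood.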
Methodology, Computation
3 code implementations • 16 Jul 2015 • Matthias Katzfuss
The M-RA process is specified as a linear combination of basis functions at multiple levels of spatial resolution, which can capture spatial structure from very fine to very large scales.
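To illustrate the multi-resolution idea only — this sketch omits the M-RA's hierarchical conditioning and uses a generic compactly supported bisquare basis, so it is not the paper's exact construction — each level doubles the number of knots while shrinking the support radius:

```python
import numpy as np

def bisquare(d, r):
    """Compactly supported bisquare basis function (illustrative choice)."""
    return np.where(np.abs(d) < r, (1 - (d / r) ** 2) ** 2, 0.0)

def multires_basis(x, n_levels=3):
    """Basis matrix with columns at multiple spatial resolutions:
    each level doubles the knots and halves the support radius."""
    cols = []
    for lvl in range(n_levels):
        knots = np.linspace(0, 1, 2 ** (lvl + 2))
        r = 1.5 / 2 ** (lvl + 1)          # support shrinks with level
        for c in knots:
            cols.append(bisquare(x - c, r))
    return np.column_stack(cols)

x = np.linspace(0, 1, 200)
B = multires_basis(x)   # 4 + 8 + 16 = 28 basis functions
```

Coarse-level columns are broadly supported and capture large-scale structure; fine-level columns are narrow and pick up local detail — the combination is what lets a single linear expansion span scales.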
Methodology, Computation
no code implementations • 10 Apr 2012 • Matthias Katzfuss
With the proliferation of modern high-resolution measuring instruments mounted on satellites, planes, ground-based vehicles and monitoring stations, a need has arisen for statistical methods suitable for the analysis of large spatial datasets observed on large spatial domains.
Methodology, Applications, Computation