Uncertainty Quantification
509 papers with code • 0 benchmarks • 0 datasets
Benchmarks
These leaderboards are used to track progress in Uncertainty Quantification.
Libraries
Use these libraries to find Uncertainty Quantification models and implementations.
Most implemented papers
Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules
Many important tasks in chemistry revolve around molecules during reactions.
Uncertainty Sets for Image Classifiers using Conformal Prediction
Convolutional image classifiers can achieve high predictive accuracy, but quantifying their uncertainty remains an unresolved challenge, hindering their deployment in consequential settings.
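Split conformal prediction turns any classifier's softmax scores into prediction sets with a guaranteed marginal coverage level. A minimal sketch (the data and scores here are synthetic placeholders, not the paper's actual method details):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_classes = 500, 10
# Hypothetical calibration set: softmax outputs and true labels from any classifier.
probs_cal = rng.dirichlet(np.ones(n_classes), size=n_cal)
labels_cal = rng.integers(0, n_classes, size=n_cal)

alpha = 0.1  # target miscoverage: sets should contain the true label ~90% of the time
# Conformity score: 1 minus the probability assigned to the true class.
scores = 1.0 - probs_cal[np.arange(n_cal), labels_cal]
# Finite-sample-corrected quantile of the calibration scores.
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal, method="higher")

def prediction_set(probs_test):
    """All classes whose conformity score falls below the calibrated threshold."""
    return np.where(1.0 - probs_test <= q)[0]

new_probs = rng.dirichlet(np.ones(n_classes))
print(sorted(prediction_set(new_probs)))
```

Larger sets signal higher uncertainty; the coverage guarantee holds regardless of how accurate the underlying classifier is.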
Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness
Bayesian neural networks (BNNs) and deep ensembles are principled approaches to estimating the predictive uncertainty of a deep learning model.
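The standard way a deep ensemble quantifies uncertainty is to average member predictions and decompose the predictive entropy into an aleatoric part (average per-member entropy) and an epistemic part (member disagreement). A minimal sketch with hand-written stand-in softmax outputs, assuming each row came from one trained network:

```python
import numpy as np

# Hypothetical softmax outputs from three independently trained ensemble members
# for a single input, shape (K, n_classes).
member_probs = np.array([
    [0.70, 0.20, 0.10],
    [0.65, 0.25, 0.10],
    [0.10, 0.80, 0.10],  # one disagreeing member signals epistemic uncertainty
])

mean_probs = member_probs.mean(axis=0)

def entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=-1)

total = entropy(mean_probs)               # total predictive uncertainty
aleatoric = entropy(member_probs).mean()  # expected per-member entropy
epistemic = total - aleatoric             # mutual information: disagreement
print(total, aleatoric, epistemic)
```

By the concavity of entropy the epistemic term is non-negative, and it is exactly zero when all members agree.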
Rule-based Bayesian regression
We introduce a novel rule-based approach for handling regression problems.
COMBO: Conservative Offline Model-Based Policy Optimization
COMBO is a new model-based offline RL algorithm that regularizes the value function on out-of-support state-action tuples generated via rollouts under the learned model, avoiding the need for explicit uncertainty estimation.
Deep Deterministic Uncertainty: A Simple Baseline
Reliable uncertainty from deterministic single-forward pass models is sought after because conventional methods of uncertainty quantification are computationally expensive.
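One family of single-forward-pass methods scores uncertainty by fitting a density to the network's feature space and flagging inputs with low density. A rough sketch using per-class Gaussians with a shared covariance over synthetic stand-in features (the actual architecture and fitting details differ in the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical feature vectors (e.g. penultimate-layer activations) for two classes.
feats = np.vstack([rng.normal(0.0, 1.0, size=(100, 4)),
                   rng.normal(4.0, 1.0, size=(100, 4))])
labels = np.repeat([0, 1], 100)

# One Gaussian per class with a shared covariance, fitted from labeled features.
means = np.stack([feats[labels == k].mean(axis=0) for k in (0, 1)])
cov = np.cov(feats - means[labels], rowvar=False) + 1e-6 * np.eye(4)
cov_inv = np.linalg.inv(cov)

def log_density(x):
    """Max class log-density (up to a constant); low values flag unfamiliar inputs."""
    d = x - means                                 # (2, 4) offsets to each class mean
    m = np.einsum("kd,de,ke->k", d, cov_inv, d)   # squared Mahalanobis distances
    return (-0.5 * m).max()

in_dist = log_density(np.zeros(4))       # near class 0's mean
ood = log_density(np.full(4, 20.0))      # far from both classes
print(in_dist > ood)
```

A single forward pass plus this density lookup replaces the many passes an ensemble or MC-dropout method would need.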
A Gentle Introduction to Conformal Prediction and Distribution-Free Uncertainty Quantification
Conformal prediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of black-box models.
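For regression, the same recipe yields distribution-free intervals: calibrate a quantile of the absolute residuals on held-out data, then pad every prediction by it. A minimal sketch with synthetic data standing in for any fitted model:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical held-out calibration targets and model predictions.
y_cal = rng.normal(size=300)
pred_cal = y_cal + rng.normal(scale=0.5, size=300)  # stand-in model outputs

alpha = 0.1
residuals = np.abs(y_cal - pred_cal)
n = len(residuals)
# Finite-sample-corrected quantile of the calibration residuals.
q = np.quantile(residuals, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

def interval(pred):
    """Interval with ~90% marginal coverage, no distributional assumptions."""
    return pred - q, pred + q

lo, hi = interval(1.3)
print(lo, hi)
```

The guarantee is marginal over calibration and test points drawn exchangeably; the interval width reflects how accurate the underlying model is.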
Deep Uncertainty Quantification: A Machine Learning Approach for Weather Forecasting
We cast the weather forecasting problem as an end-to-end deep learning problem and solve it by proposing a novel negative log-likelihood error (NLE) loss function.
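The exact NLE loss is defined in the paper; the common template it builds on is a Gaussian negative log-likelihood in which the network predicts both a mean and a log-variance, so the loss rewards honest uncertainty. A minimal sketch (array values are illustrative):

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Negative log-likelihood of y under N(mu, exp(log_var)), averaged over samples.
    Predicting log-variance keeps the variance positive and the loss numerically stable."""
    return 0.5 * np.mean(log_var + (y - mu) ** 2 / np.exp(log_var))

y = np.array([1.0, 2.0, 3.0])
mu = np.array([1.1, 1.9, 3.2])
confident = gaussian_nll(y, mu, np.full(3, -2.0))  # small predicted variance
uncertain = gaussian_nll(y, mu, np.full(3, 2.0))   # large predicted variance
```

When residuals are small, claiming low variance is rewarded; when residuals are large, the squared-error term penalizes overconfidence, so the model learns calibrated variances alongside point forecasts.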
Deep learning observables in computational fluid dynamics
Under the assumption that the underlying neural networks generalize well, we prove that the deep learning MC and QMC algorithms are guaranteed to be faster than the baseline (quasi-) Monte Carlo methods.
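The speedup comes from replacing the expensive solver inside a (quasi-) Monte Carlo loop with a cheap trained surrogate. A toy sketch of that pattern, where a lookup table over a synthetic observable stands in for both the CFD solver and the trained network:

```python
import numpy as np

rng = np.random.default_rng(3)

def expensive_solver(x):
    """Stand-in for a costly CFD observable; purely illustrative."""
    return np.sin(3 * x) + 0.5 * x ** 2

# Hypothetical cheap surrogate (in practice a trained network); here interpolation
# over a small table of precomputed solver evaluations.
grid = np.linspace(0.0, 1.0, 64)
table = expensive_solver(grid)

def surrogate(x):
    return np.interp(x, grid, table)

# MC estimate of E[observable] over uniform random inputs, via the surrogate.
x = rng.uniform(0.0, 1.0, size=10_000)
estimate = surrogate(x).mean()
reference = expensive_solver(x).mean()
```

Each surrogate call costs microseconds instead of a full solve, which is where the guaranteed speedup over baseline (quasi-) Monte Carlo comes from when the surrogate generalizes well.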
Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels
Recently, various machine learning methods have been introduced to tackle the challenging few-shot learning scenario, that is, learning from a small labeled dataset related to a specific task.