Normalizing Flows for Calibration and Recalibration

1 Jan 2021 · Achintya Gopal, Aaron Key

In machine learning, due to model misspecification and overfitting, estimates of the aleatoric uncertainty are often inaccurate. One approach to fix this is isotonic regression, in which a monotonic function is fit on a validation set to map the model's CDF to an optimally calibrated CDF. However, this makes it infeasible to compute additional statistics of interest on the model distribution (such as the mean). In this paper, we replace isotonic regression with normalizing flows. This allows us to retain the ability to compute the statistical properties of the model and provides an opportunity for additional capacity at the cost of possible overfitting. This approach also gives closed-form likelihoods as long as the original model distribution is also closed-form. Most importantly, the fundamental properties of normalizing flows allow us to recalibrate correlations. To aid in detecting miscalibration and measuring our success at fixing it, we introduce a new CDF Performance Plot, allowing a practitioner to diagnose calibration issues at a glance.
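To make the recalibration idea concrete, below is a minimal, illustrative sketch (not the paper's architecture) of flow-based recalibration of a 1-D predictive distribution. It assumes PyTorch and a hypothetical logistic-mixture parameterization of a monotone bijection g of (0,1): if the base model were calibrated, the probability integral transform (PIT) values u = F(y|x) on a validation set would be Uniform(0,1), so we fit g by maximum likelihood to push them toward uniformity. The recalibrated CDF is then g(F(y|x)) and the recalibrated density is g'(F(y|x)) · p(y|x), which stays closed-form whenever the base model's likelihood is.

```python
# Illustrative sketch only: recalibrating a 1-D predictive distribution with a
# small normalizing flow fit on probability integral transform (PIT) values.
import torch

class LogisticMixtureFlow(torch.nn.Module):
    """Monotone bijection of (0,1): g(u) = sum_k w_k * sigmoid(a_k * logit(u) + b_k)."""
    def __init__(self, n_components=8):
        super().__init__()
        self.logit_w = torch.nn.Parameter(torch.zeros(n_components))  # mixture weights (pre-softmax)
        self.log_a = torch.nn.Parameter(torch.zeros(n_components))    # log of positive slopes
        self.b = torch.nn.Parameter(torch.zeros(n_components))        # shifts

    def forward(self, u):
        u = u.clamp(1e-6, 1 - 1e-6).unsqueeze(-1)          # (N, 1), keep logit finite
        w = torch.softmax(self.logit_w, dim=0)             # weights sum to 1
        a = self.log_a.exp()                                # positive slopes => monotone
        s = torch.sigmoid(a * torch.logit(u) + self.b)      # (N, K)
        g = (w * s).sum(-1)                                 # recalibrated CDF values
        # d/du sigmoid(a*logit(u) + b) = a * s * (1 - s) / (u * (1 - u))
        log_dg = torch.log((w * a * s * (1 - s)).sum(-1)) \
                 - torch.log(u.squeeze(-1) * (1 - u.squeeze(-1)))
        return g, log_dg

# u_val: PIT values F(y_i | x_i) from the base model on a held-out validation set.
u_val = torch.rand(2048)  # placeholder; replace with real validation PITs
flow = LogisticMixtureFlow()
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    _, log_dg = flow(u_val)
    loss = -log_dg.mean()   # NLL of u under density g'(u), since g(u) ~ Uniform(0,1)
    loss.backward()
    opt.step()

# Recalibrated CDF/density for a new prediction with base CDF F and density p:
#   F_recal(y|x) = g(F(y|x));   p_recal(y|x) = g'(F(y|x)) * p(y|x)
```

Isotonic regression fit on the same PIT values would play the role of g here but, being a nonparametric step function without a usable derivative, it does not yield the closed-form recalibrated density above; that is the trade-off the abstract refers to.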
