Generalized Bayesian Conformal Inference

In practice, prediction intervals are often poorly calibrated, with coverage levels far from their nominal values. This may be due, for example, to overshrinkage or prior misspecification. Various approaches have been taken to remedy this; one that has enjoyed recent success is conformal quantile regression (CQR). CQR uses quantile estimators fit on a training set to learn a “naive” interval and then adjusts that interval based on its coverage on a separate calibration set. The resulting intervals are guaranteed finite-sample coverage. Unfortunately, the underlying cause of the poor coverage, such as an overly small variance estimate, is not remedied by this adjustment. We propose a Generalized Bayesian procedure for learning a calibrated posterior distribution that minimizes a discrepancy from an initial posterior, subject to a conformal constraint guaranteeing the validity of prediction intervals. To our knowledge, this is the first procedure to use conformal adjustments of predictive intervals to inform parameter inference.
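To make the CQR step described above concrete, here is a minimal sketch of split conformal quantile regression in Python. The model choice (scikit-learn's `GradientBoostingRegressor` with the quantile loss), the simulated data, and the target level are illustrative assumptions, not the procedure or code from the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Illustrative sketch of split conformal quantile regression (CQR); not the paper's code.
rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(-2, 2, size=(n, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * (1 + np.abs(X[:, 0])) * rng.standard_normal(n)

# Split the data: a training set to fit the quantile estimators,
# a separate calibration set to adjust the resulting interval.
train, calib = np.arange(0, n // 2), np.arange(n // 2, n)
alpha = 0.1  # target miscoverage level

# "Naive" interval from lower/upper quantile estimators fit on the training set.
lo = GradientBoostingRegressor(loss="quantile", alpha=alpha / 2).fit(X[train], y[train])
hi = GradientBoostingRegressor(loss="quantile", alpha=1 - alpha / 2).fit(X[train], y[train])

# Conformity scores: how far each calibration point falls outside the naive interval.
scores = np.maximum(lo.predict(X[calib]) - y[calib], y[calib] - hi.predict(X[calib]))

# Finite-sample-valid correction: the ceil((1 - alpha)(n_cal + 1))-th smallest score.
n_cal = len(calib)
q_hat = np.quantile(scores, np.ceil((1 - alpha) * (n_cal + 1)) / n_cal, method="higher")

# Adjusted interval for a new point x: [lo(x) - q_hat, hi(x) + q_hat].
x_new = np.array([[0.5]])
print(lo.predict(x_new) - q_hat, hi.predict(x_new) + q_hat)
```

The widened intervals enjoy the usual finite-sample marginal coverage guarantee, but, as noted above, the correction `q_hat` is applied only to the intervals and does not feed back into the fitted model or its parameters.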

Vittorio Orlandi