On the Robustness of α-Posteriors to Model Misspecification

Cynthia Rush – Columbia University


Variational inference (VI) is a machine learning technique that approximates difficult-to-compute probability densities through optimization. While VI has been used in numerous applications, it is particularly useful in Bayesian statistics, where one wishes to perform statistical inference about unknown parameters via calculations on a posterior density. In this talk, I will review the core concepts of VI and discuss some new ideas about VI and robustness to model misspecification. In particular, we will study α-posteriors, which distort standard posterior inference by raising the likelihood to a power α, thereby downweighting the data relative to the prior, and their variational approximations. We will see that such distortions, if tuned appropriately, can outperform standard posterior inference when there is potential parametric model misspecification.
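To make the downweighting concrete, here is a minimal sketch of an α-posterior in a conjugate Gaussian model, where it stays in closed form. The function name and parameter choices are illustrative, not from the talk: with a N(mu0, tau02) prior on the mean of N(theta, sigma2) data, tempering the likelihood simply scales the data term by α.

```python
import numpy as np

def alpha_posterior_normal(x, sigma2, mu0, tau02, alpha):
    """alpha-posterior for the mean of N(theta, sigma2) data
    under a N(mu0, tau02) prior.

    Raising the likelihood to the power alpha preserves conjugacy:
    the result is Gaussian, with the data contribution scaled by alpha.
    """
    n = len(x)
    # Posterior precision: prior precision plus alpha-scaled data precision.
    prec = 1.0 / tau02 + alpha * n / sigma2
    # Posterior mean: precision-weighted combination of prior mean and data.
    mean = (mu0 / tau02 + alpha * np.sum(x) / sigma2) / prec
    return mean, 1.0 / prec  # mean and variance

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=50)

m1, v1 = alpha_posterior_normal(x, 1.0, 0.0, 10.0, alpha=1.0)  # standard posterior
m5, v5 = alpha_posterior_normal(x, 1.0, 0.0, 10.0, alpha=0.5)  # tempered posterior
print(v1 < v5)  # downweighting the likelihood widens the posterior → True
```

Setting α = 1 recovers the usual posterior; α < 1 inflates the posterior variance, which is one intuition for why a well-tuned α can hedge against a misspecified likelihood.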