Black Box Variational Inference and Deep Exponential Families
Professor David Blei, Columbia University
Bayesian statistics and expressive probabilistic modeling have become key tools for the modern statistician. These tools let us express complex assumptions about the hidden elements that underlie our data, and they have been successfully applied in numerous fields. The central computational problem in Bayesian statistics is posterior inference: approximating the conditional distribution of the hidden variables given the observations. Approximate posterior inference algorithms have revolutionized the field, revealing its potential as a usable and general-purpose language for data analysis. In this talk, I will discuss two related innovations in modeling and inference: deep exponential families and black box variational inference. Deep exponential families (DEFs) adapt the main ideas behind deep learning to expressive probabilistic models. DEFs provide principled probabilistic models that can uncover layered representations of high-dimensional data. I will show how to use DEFs to analyze text, recommendation data, and electronic health records. I will then discuss the key algorithm that enables DEFs: black box variational inference (BBVI). BBVI is a generic and scalable algorithm for approximating the posterior. It applies easily to many models, requiring little model-specific derivation and placing few restrictions on model properties.
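
To make the "black box" idea concrete, below is a minimal sketch of the BBVI score-function gradient estimator (Ranganath et al., 2014) on a toy one-dimensional model. The estimator only needs pointwise evaluations of the model's log joint density, which is what lets the same code apply to many models. The toy model, the Gaussian variational family, and all names and hyperparameters here are illustrative assumptions, not code from the talk.

```python
import numpy as np

def log_joint(z):
    # Toy model (illustrative): standard normal prior on z,
    # one Gaussian observation x = 3 with unit noise.
    x = 3.0
    return -0.5 * z**2 - 0.5 * (x - z)**2

def bbvi_step(mu, log_sigma, rng, num_samples=64, lr=0.01):
    """One stochastic-gradient ascent step on the ELBO for a
    Gaussian variational family q(z; mu, sigma)."""
    sigma = np.exp(log_sigma)
    z = mu + sigma * rng.standard_normal(num_samples)  # z ~ q(z; lambda)

    # log q(z; lambda) and its gradients w.r.t. the variational parameters
    log_q = -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)
    dlogq_dmu = (z - mu) / sigma**2
    dlogq_dlogsigma = ((z - mu) / sigma) ** 2 - 1.0

    # Score-function (REINFORCE) estimator of the ELBO gradient:
    #   grad_lambda ELBO = E_q[ grad_lambda log q(z) * (log p(x,z) - log q(z)) ]
    f = log_joint(z) - log_q
    grad_mu = np.mean(dlogq_dmu * f)
    grad_logsigma = np.mean(dlogq_dlogsigma * f)

    return mu + lr * grad_mu, log_sigma + lr * grad_logsigma

rng = np.random.default_rng(0)
mu, log_sigma = 0.0, 0.0
for _ in range(2000):
    mu, log_sigma = bbvi_step(mu, log_sigma, rng)
# Should approach the exact posterior N(1.5, 0.5): mu ~ 1.5, sigma ~ 0.71.
print(mu, np.exp(log_sigma))
```

Note that the estimator never differentiates through `log_joint`; swapping in a different model requires changing only that one function, at the cost of the higher gradient variance typical of score-function estimators.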