12/07/2020

All in the (Exponential) Family: Information Geometry and Thermodynamic Variational Inference

Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, Aram Galstyan

Keywords: Probabilistic Inference - Models and Probabilistic Programming

Abstract: While the Evidence Lower Bound (ELBO) has become a ubiquitous objective for variational inference, the recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a tighter and more general family of bounds. In previous work, the tightness of these bounds was not known, grid search was used to choose a "schedule" of intermediate distributions, and model learning suffered with ostensibly tighter bounds. We interpret the geometric mixture curve common to the TVO and related path sampling methods using the geometry of exponential families, which allows us to characterize the gap in TVO bounds as a sum of KL divergences along a given path. Further, we propose a principled technique for choosing intermediate distributions using equal spacing in the moment parameters of our exponential family. We demonstrate that this scheduling approach adapts to the shape of the integrand defining the TVO objective and improves overall performance. Additionally, we derive a reparameterized gradient estimator which empirically allows the TVO to benefit from additional, well-chosen partitions. Finally, we provide a unified framework for understanding thermodynamic integration and the TVO in terms of Taylor series remainders.
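The two ideas in the abstract can be illustrated with a small sketch: the TVO lower bound as a left Riemann sum of the thermodynamic integrand E_{pi_beta}[log w] over a schedule of inverse temperatures beta, and a schedule chosen by equal spacing in the moment parameters (the integrand values themselves). This is not the authors' implementation; function names are mine, and the expectations under the tempered distributions pi_beta are approximated here by self-normalized importance sampling from precomputed log-weights log w = log p(x, z) - log q(z|x), which is a simplifying assumption.

```python
import numpy as np

def snis_expectation(log_w, beta):
    """Self-normalized importance-sampling estimate of E_{pi_beta}[log w],
    where pi_beta is the geometric mixture between q (beta=0) and p (beta=1)."""
    s = beta * log_w
    w = np.exp(s - np.max(s))  # stabilized SNIS weights
    w /= w.sum()
    return float(np.sum(w * log_w))

def tvo_lower_bound(log_w, betas):
    """Left Riemann sum of the thermodynamic integrand over `betas`
    (must start at 0 and end at 1); betas=[0, 1] recovers the ELBO estimate."""
    total = 0.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        total += (b1 - b0) * snis_expectation(log_w, b0)
    return total

def moment_schedule(log_w, num_partitions):
    """Choose betas so the integrand values (moment parameters of the
    exponential family along the path) are roughly equally spaced.
    Uses bisection, since the integrand is nondecreasing in beta."""
    lo_val = snis_expectation(log_w, 0.0)   # ELBO-like endpoint
    hi_val = snis_expectation(log_w, 1.0)   # EUBO-like endpoint
    targets = np.linspace(lo_val, hi_val, num_partitions + 1)[1:-1]
    betas = [0.0]
    for t in targets:
        lo, hi = 0.0, 1.0
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            if snis_expectation(log_w, mid) < t:
                lo = mid
            else:
                hi = mid
        betas.append(0.5 * (lo + hi))
    betas.append(1.0)
    return betas
```

Because the integrand is nondecreasing in beta, refining the left Riemann sum can only tighten the lower bound, which is why well-chosen additional partitions help; the moment-spaced schedule places those partitions where the integrand changes fastest.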

The talk and the paper were published at the ICML 2020 virtual conference.

