26/08/2020

Gaussianization Flows

Chenlin Meng, Yang Song, Jiaming Song, Stefano Ermon

Keywords:

Abstract: Iterative Gaussianization is a fixed-point iteration procedure that transforms a continuous distribution into a Gaussian distribution. Based on iterative Gaussianization, we propose a new type of normalizing flow model that grants both efficient computation of likelihoods and efficient inversion for sample generation. We demonstrate that this new family of flow models, named Gaussianization flows, are universal approximators for continuous probability distributions under some regularity conditions. This guaranteed expressivity enables them to capture multimodal target distributions better without compromising efficiency in sample generation. Experimentally, we show that Gaussianization flows achieve better or comparable performance on several tabular datasets compared to other efficiently invertible flow models such as Real NVP, Glow and FFJORD. In particular, Gaussianization flows are easier to initialize, are more robust to different transformations of the training data, and generalize better on small training sets.
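To illustrate the fixed-point procedure the abstract builds on, here is a minimal sketch of one classical iterative Gaussianization step: marginally Gaussianize each dimension via a rank-based CDF estimate composed with the inverse Gaussian CDF, then apply a random orthogonal rotation so the next marginal step can remove remaining dependence. This is a toy illustration of the general idea, not the paper's learned flow; the function names and the rank-based CDF estimate are this sketch's own choices.

```python
import numpy as np
from scipy.stats import norm

def marginal_gaussianize(x):
    # Rank-based estimate of each marginal CDF, pushed through the
    # inverse standard-normal CDF, making every marginal ~N(0, 1).
    n = x.shape[0]
    ranks = x.argsort(axis=0).argsort(axis=0) + 1
    u = ranks / (n + 1)  # empirical CDF values in (0, 1)
    return norm.ppf(u)

def gaussianization_iteration(x, rng):
    z = marginal_gaussianize(x)
    # Random orthogonal rotation (QR of a Gaussian matrix) mixes the
    # dimensions so subsequent marginal steps reduce joint non-Gaussianity.
    q, _ = np.linalg.qr(rng.standard_normal((x.shape[1], x.shape[1])))
    return z @ q

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 2)) ** 3  # a non-Gaussian toy distribution
for _ in range(5):
    x = gaussianization_iteration(x, rng)
```

The paper's contribution is to turn such iterations into a trainable, efficiently invertible flow; this sketch only shows the underlying fixed-point idea.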

Talk and the respective paper are published at the AISTATS 2020 virtual conference.

