12/07/2020

Latent Bernoulli Autoencoder

Jiri Fajtl, Vasileios Argyriou, Dorothy Monekosso, Paolo Remagnino

Keywords: Deep Learning - Generative Models and Autoencoders

Abstract: In this work, we pose the question of whether it is possible to design and train an autoencoder model end-to-end to learn latent representations in a multivariate Bernoulli space, and to achieve performance comparable to the current state-of-the-art variational methods. Moreover, we investigate how to generate novel samples and perform smooth interpolation in the binary latent space. To meet our objective, we propose a simplified deterministic model with a straight-through estimator to learn the binary latents and show its competitiveness with the latest VAE methods. Furthermore, we propose a novel method based on random hyperplane rounding for sampling and smooth interpolation in the multivariate Bernoulli latent space. Although not a main objective, we demonstrate that our methods perform on par with or better than the current state-of-the-art methods on the common CelebA, CIFAR-10 and MNIST datasets. PyTorch code and trained models to reproduce the published results will be released with the camera-ready version.
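The abstract names a straight-through estimator as the mechanism for learning binary latents, but gives no implementation details. The following is a minimal PyTorch sketch of the standard straight-through binarization trick that term usually refers to, not the authors' released code; the module name `BinaryLatent` and the dimensions in the usage example are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a standard straight-through
# binarization layer of the kind the abstract refers to.
import torch
import torch.nn as nn


class BinaryLatent(nn.Module):
    """Maps encoder logits to {0, 1} codes with a straight-through estimator."""

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        probs = torch.sigmoid(logits)        # soft Bernoulli parameters
        hard = (probs > 0.5).float()         # deterministic hard threshold
        # Straight-through trick: the forward pass emits the hard binary codes,
        # while gradients in the backward pass flow through the soft probabilities.
        return hard + probs - probs.detach()


if __name__ == "__main__":
    layer = BinaryLatent()
    z = layer(torch.randn(4, 64, requires_grad=True))
    print(z.unique())                        # tensor([0., 1.])
```

The decoder then consumes these hard codes directly, so the whole autoencoder can be trained end-to-end with ordinary backpropagation despite the non-differentiable thresholding step. The random-hyperplane-rounding procedure for sampling and interpolation is specific to the paper and is not reproduced here.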

This talk and the corresponding paper were presented at the ICML 2020 virtual conference.
