
Released

Paper

"Best-of-Many-Samples" Distribution Matching

MPS-Authors

Bhattacharyya, Apratim
Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society

Schiele, Bernt
Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society

Fulltext (public)

arXiv:1909.12598.pdf
(Preprint), 5MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Bhattacharyya, A., Fritz, M., & Schiele, B. (2019). "Best-of-Many-Samples" Distribution Matching. Retrieved from http://arxiv.org/abs/1909.12598.


Cite as: https://hdl.handle.net/21.11116/0000-0005-554A-9
Abstract
Generative Adversarial Networks (GANs) can achieve state-of-the-art sample
quality in generative modelling tasks, but suffer from the mode collapse
problem. Variational Autoencoders (VAEs), on the other hand, explicitly
maximize a reconstruction-based data log-likelihood, which forces them to
cover all modes, but they suffer from poorer sample quality. Recent works have
proposed hybrid VAE-GAN frameworks which integrate a GAN-based synthetic
likelihood into the VAE objective to address both the mode collapse and sample
quality issues, with limited success. This is because the VAE objective forces
a trade-off between the data log-likelihood and the divergence to the latent
prior. The synthetic likelihood ratio term also shows instability during
training. We propose a novel objective with a "Best-of-Many-Samples"
reconstruction cost and a stable direct estimate of the synthetic likelihood.
This enables our hybrid VAE-GAN framework to achieve high data log-likelihood
and low divergence to the latent prior at the same time, and to show
significant improvement over both hybrid VAE-GANs and plain GANs in mode
coverage and sample quality.
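The core idea of a best-of-many-samples reconstruction cost is to draw several latent samples per data point and penalize only the best reconstruction, rather than the average, so the model is free to spread its other samples across modes. The following is a minimal numpy sketch of that idea only; the toy linear decoder, function names, and sampling distribution are illustrative assumptions, not the paper's actual model or training objective.

```python
import numpy as np

rng = np.random.default_rng(0)


def best_of_many_reconstruction(x, decode, z_samples):
    """Best-of-many-samples reconstruction cost: decode every latent
    sample and keep only the lowest reconstruction error for this data
    point, instead of averaging over all samples as a plain VAE would."""
    errors = np.array([np.sum((decode(z) - x) ** 2) for z in z_samples])
    return errors.min()


# Toy stand-in for a neural decoder: a fixed random linear map (assumption).
W = rng.normal(size=(2, 3))
decode = lambda z: z @ W

x = np.array([0.5, -1.0, 0.25])

# Draw T latent samples; here simply from N(0, I) for illustration.
z_samples = rng.normal(size=(10, 2))

bom_cost = best_of_many_reconstruction(x, decode, z_samples)
avg_cost = np.mean([np.sum((decode(z) - x) ** 2) for z in z_samples])

# The best sample is never worse than the average over samples.
assert bom_cost <= avg_cost
```

Because the cost depends only on the best of the T samples, the gradient signal rewards at least one sample matching the data well, which is what lets the remaining samples stay diverse.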