
Conference Paper

Training sparse natural image models with a fast Gibbs sampler of an extended state space

Citation

Theis, L., Sohl-Dickstein, J., & Bethge, M. (2013). Training sparse natural image models with a fast Gibbs sampler of an extended state space. In P. Bartlett, F. Pereira, L. Bottou, C. Burges, & K. Weinberger (Eds.), Twenty-Sixth Annual Conference on Neural Information Processing Systems (NIPS 2012) (pp. 1133-1141). Red Hook, NY, USA: Curran.


Cite as: https://hdl.handle.net/21.11116/0000-0001-1C34-6
Abstract
We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models. Particular emphasis is placed on statistical image modeling, where overcomplete models have played an important role in discovering sparse representations. Our Gibbs sampler is faster than general-purpose sampling schemes while requiring no tuning, as it is parameter-free. Using the Gibbs sampler and a persistent variant of expectation maximization, we are able to extract highly sparse distributions over latent sources from data. When applied to natural images, our algorithm learns source distributions which resemble spike-and-slab distributions. We evaluate the likelihood and quantitatively compare the performance of the overcomplete linear model to its complete counterpart, as well as to a product-of-experts model, which represents another overcomplete generalization of the complete linear model. In contrast to previous claims, we find that overcomplete representations lead to significant improvements, but that the overcomplete linear model still underperforms other models.
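
To illustrate the general idea of blocked Gibbs sampling in an extended state space, the Python sketch below is a minimal, hypothetical example, not the paper's actual sampler. It assumes a linear model x = A s + noise with Gaussian observation noise and a finite Gaussian scale mixture prior on each source (a two-component mixture gives a spike-and-slab-like prior); the function name gibbs_sweep and all parameter choices are illustrative assumptions. The point it demonstrates is the one the abstract relies on: augmenting the state with per-source scale indicators z makes both conditionals closed-form, so each sweep consists of exact draws and no step sizes or proposal distributions need tuning.

    import numpy as np

    def gibbs_sweep(x, A, s, pi, sigmas, sigma_n, rng):
        # One blocked Gibbs sweep for x = A s + noise, where each source s_i
        # has a finite Gaussian scale mixture prior
        #   p(s_i) = sum_k pi[k] * N(s_i; 0, sigmas[k]^2).
        # Extended state (s, z): z_i indexes the active scale component.
        K = len(pi)

        # Block 1: z | s -- independent categorical posteriors over scales.
        log_post = (np.log(pi)[None, :] - np.log(sigmas)[None, :]
                    - 0.5 * (s[:, None] / sigmas[None, :]) ** 2)
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=p) for p in post])

        # Block 2: s | z, x -- a single joint Gaussian, drawn in one block.
        # Posterior precision: A^T A / sigma_n^2 + diag(1 / sigma_z^2).
        prec = A.T @ A / sigma_n ** 2 + np.diag(1.0 / sigmas[z] ** 2)
        cov = np.linalg.inv(prec)
        mean = cov @ (A.T @ x) / sigma_n ** 2
        s = rng.multivariate_normal(mean, cov)
        return s, z

    # Illustrative usage with made-up dimensions and hyperparameters.
    rng = np.random.default_rng(0)
    D, H = 8, 16                       # overcomplete: more sources than observations
    A = rng.standard_normal((D, H)) / np.sqrt(H)
    pi = np.array([0.9, 0.1])          # spike-and-slab-like scale mixture
    sigmas = np.array([0.05, 2.0])
    x = A @ (sigmas[rng.choice(2, H, p=pi)] * rng.standard_normal(H))
    s = rng.standard_normal(H)
    for _ in range(100):
        s, z = gibbs_sweep(x, A, s, pi, sigmas, sigma_n=0.1, rng=rng)

Because both blocks are exact conditional draws (a categorical and a multivariate Gaussian), the sampler has no free parameters to tune, in contrast to general-purpose schemes such as Metropolis-Hastings or Hamiltonian Monte Carlo; the paper's own sampler and model details differ and should be taken from the publication itself.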