Record


Released

Conference Paper

Training sparse natural image models with a fast Gibbs sampler of an extended state space

MPG Authors
There are no MPG authors listed for this publication
Full texts (freely accessible)
No freely accessible full texts are available
Supplementary material (freely accessible)
No freely accessible supplementary material is available
Citation

Theis, L., Sohl-Dickstein, J., & Bethge, M. (2013). Training sparse natural image models with a fast Gibbs sampler of an extended state space. In P. Bartlett, F. Pereira, L. Bottou, C. Burges, & K. Weinberger (Eds.), Twenty-Sixth Annual Conference on Neural Information Processing Systems (NIPS 2012) (pp. 1133-1141). Red Hook, NY, USA: Curran.


Citation link: http://hdl.handle.net/21.11116/0000-0001-1C34-6
Abstract
We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models. Particular emphasis is placed on statistical image modeling, where overcomplete models have played an important role in discovering sparse representations. Our Gibbs sampler is faster than general-purpose sampling schemes while requiring no tuning, as it is free of parameters. Using the Gibbs sampler and a persistent variant of expectation maximization, we are able to extract highly sparse distributions over latent sources from data. When applied to natural images, our algorithm learns source distributions which resemble spike-and-slab distributions. We evaluate the likelihood and quantitatively compare the performance of the overcomplete linear model to its complete counterpart as well as a product-of-experts model, which represents another overcomplete generalization of the complete linear model. In contrast to previous claims, we find that overcomplete representations lead to significant improvements, but that the overcomplete linear model still underperforms other models.
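For illustration only, the following is a minimal sketch of the kind of blocked Gibbs sampler the abstract describes; it is not the authors' implementation. It assumes a linear model x = A s + Gaussian noise with a finite Gaussian scale mixture prior on the sources, so the extended state consists of per-source scale indicators z: the sampler alternates between drawing z given s component-wise and drawing all sources s given z in one jointly Gaussian block. The function name blocked_gibbs, the mixture parameterization, and all variable names are assumptions made for this sketch.

# Sketch (not the authors' code): blocked Gibbs sampling for a sparse
# overcomplete linear model x = A s + noise, with a finite Gaussian scale
# mixture prior on each source s_i. The "extended state" is the per-source
# scale indicator z_i; conditioned on z, the source posterior is Gaussian
# and can be sampled in a single block.
import numpy as np

def blocked_gibbs(x, A, sigma_noise, mix_weights, mix_scales, n_steps=100, rng=None):
    """Draw a posterior sample of the sources s given an observation x.

    x           : (D,)   observed data vector
    A           : (D, K) dictionary / feature matrix (K >= D for overcomplete models)
    sigma_noise : float  standard deviation of the additive Gaussian noise
    mix_weights : (M,)   mixture weights of the Gaussian scale mixture prior
    mix_scales  : (M,)   standard deviations of the mixture components
    """
    rng = np.random.default_rng() if rng is None else rng
    D, K = A.shape
    s = rng.standard_normal(K)                       # initial source state

    for _ in range(n_steps):
        # --- Sample scale indicators z_i | s_i (the extended state) ----------
        # Posterior over components is proportional to the prior weight times
        # the Gaussian density of s_i under each component scale.
        log_p = (np.log(mix_weights)[None, :]
                 - 0.5 * (s[:, None] / mix_scales[None, :]) ** 2
                 - np.log(mix_scales)[None, :])
        log_p -= log_p.max(axis=1, keepdims=True)
        p = np.exp(log_p)
        p /= p.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(len(mix_weights), p=row) for row in p])

        # --- Sample all sources s | z, x in one block -------------------------
        # Given the scales, the model is linear-Gaussian, so the posterior over
        # s is Gaussian with the precision and mean computed below.
        prior_prec = 1.0 / mix_scales[z] ** 2
        precision = A.T @ A / sigma_noise ** 2 + np.diag(prior_prec)
        L = np.linalg.cholesky(precision)
        mean = np.linalg.solve(precision, A.T @ x / sigma_noise ** 2)
        # Draw from N(mean, precision^{-1}) using the Cholesky factor of the precision.
        s = mean + np.linalg.solve(L.T, rng.standard_normal(K))

    return s

Conditioning on discrete scale indicators is what makes the source posterior exactly Gaussian, allowing a joint (blocked) update of all sources instead of sampling them one at a time; continuous scale mixtures such as a Laplace or Student-t prior would follow the same pattern with a different conditional for the scales.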