  Training sparse natural image models with a fast Gibbs sampler of an extended state space

Theis, L., Sohl-Dickstein, J., & Bethge, M. (2013). Training sparse natural image models with a fast Gibbs sampler of an extended state space. In P. Bartlett, F. Pereira, L. Bottou, C. Burges, & K. Weinberger (Eds.), Twenty-Sixth Annual Conference on Neural Information Processing Systems (NIPS 2012) (pp. 1133-1141). Red Hook, NY, USA: Curran.

Basic

Item Permalink: http://hdl.handle.net/21.11116/0000-0001-1C34-6
Version Permalink: http://hdl.handle.net/21.11116/0000-0001-1C35-5
Genre: Conference Paper

Creators

Creators:
Theis, Lucas¹, Author
Sohl-Dickstein, J, Author
Bethge, Matthias¹, Author
Affiliations:
¹Werner Reichardt Centre for Integrative Neuroscience, ou_persistent22

Content

Free keywords: -
Abstract: We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models. Particular emphasis is placed on statistical image modeling, where overcomplete models have played an important role in discovering sparse representations. Our Gibbs sampler is faster than general-purpose sampling schemes and requires no tuning, as it is parameter-free. Using the Gibbs sampler and a persistent variant of expectation maximization, we are able to extract highly sparse distributions over latent sources from data. When applied to natural images, our algorithm learns source distributions which resemble spike-and-slab distributions. We evaluate the likelihood and quantitatively compare the performance of the overcomplete linear model to its complete counterpart as well as a product of experts model, which represents another overcomplete generalization of the complete linear model. In contrast to previous claims, we find that overcomplete representations lead to significant improvements, but that the overcomplete linear model still underperforms other models.
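To make the abstract's setting concrete, the following is a minimal sketch of a single-site Gibbs sweep for a spike-and-slab sparse linear model. It illustrates the kind of sampler the abstract describes, not the paper's actual blocked, parameter-free scheme; the model, priors (`sigma2`, `pi`), and function names below are illustrative assumptions.

```python
import numpy as np

# Assumed illustrative model (not the paper's exact formulation):
#   x = A s + eps,   eps ~ N(0, sigma2 * I)
#   s_i = z_i * u_i, z_i ~ Bernoulli(pi), u_i ~ N(0, 1)
def gibbs_sweep(x, A, s, sigma2=0.1, pi=0.2, rng=None):
    """One Gibbs sweep: resample each latent source s_i given the rest."""
    rng = np.random.default_rng() if rng is None else rng
    K = A.shape[1]
    for i in range(K):
        a = A[:, i]
        # Residual with the i-th source's contribution removed.
        r = x - A @ s + a * s[i]
        # Gaussian posterior of the slab value u_i given z_i = 1.
        v = 1.0 / (1.0 + a @ a / sigma2)   # posterior variance
        m = v * (a @ r) / sigma2           # posterior mean
        # Posterior log-odds of the spike indicator z_i
        # (prior odds times the ratio of marginal likelihoods).
        log_odds = (np.log(pi) - np.log1p(-pi)
                    + 0.5 * np.log(v) + 0.5 * m * m / v)
        z = rng.random() < 1.0 / (1.0 + np.exp(-log_odds))
        s[i] = rng.normal(m, np.sqrt(v)) if z else 0.0
    return s

# Tiny synthetic demo: two active sources, 2x-overcomplete dictionary.
rng = np.random.default_rng(0)
D, K = 6, 12
A = rng.normal(size=(D, K))
s_true = np.zeros(K)
s_true[[2, 7]] = [1.5, -2.0]
x = A @ s_true + 0.1 * rng.normal(size=D)

s = np.zeros(K)
for _ in range(200):
    s = gibbs_sweep(x, A, s, rng=rng)
```

The paper's contribution is a *blocked* sampler over an extended state space that mixes faster than this coordinate-wise sweep; the sketch only shows the spike-and-slab conditional structure that such samplers exploit.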

Details

Language(s):
 Dates: 2013-04
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: Other: TheisSB2013
 Degree: -

Event

Title: Twenty-Sixth Annual Conference on Neural Information Processing Systems (NIPS 2012)
Place of Event: Lake Tahoe, NV, USA
Start-/End Date: 2012-12-03 - 2012-12-08

Source 1

Title: Twenty-Sixth Annual Conference on Neural Information Processing Systems (NIPS 2012)
Source Genre: Proceedings
 Creator(s):
Bartlett, P, Editor
Pereira, FCN, Editor
Bottou, L, Editor
Burges, CJC, Editor
Weinberger, KQ, Editor
Affiliations:
-
Publ. Info: Red Hook, NY, USA : Curran
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1133 - 1141
Identifier: -