Item Details


Released

Conference Paper

Training sparse natural image models with a fast Gibbs sampler of an extended state space

MPS-Authors
There are no MPG-Authors available for this publication.
Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Theis, L., Sohl-Dickstein, J., & Bethge, M. (2013). Training sparse natural image models with a fast Gibbs sampler of an extended state space. In P. Bartlett, F. Pereira, L. Bottou, C. Burges, & K. Weinberger (Eds.), Twenty-Sixth Annual Conference on Neural Information Processing Systems (NIPS 2012) (pp. 1133-1141). Red Hook, NY, USA: Curran.


Cite as: https://hdl.handle.net/21.11116/0000-0001-1C34-6
Abstract
We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models. Particular emphasis is placed on statistical image modeling, where overcomplete models have played an important role in discovering sparse representations. Our Gibbs sampler is faster than general purpose sampling schemes while also requiring no tuning as it is free of parameters. Using the Gibbs sampler and a persistent variant of expectation maximization, we are able to extract highly sparse distributions over latent sources from data. When applied to natural images, our algorithm learns source distributions which resemble spike-and-slab distributions. We evaluate the likelihood and quantitatively compare the performance of the overcomplete linear model to its complete counterpart as well as a product of experts model, which represents another overcomplete generalization of the complete linear model. In contrast to previous claims, we find that overcomplete representations lead to significant improvements, but that the overcomplete linear model still underperforms other models.
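As a rough illustration of the blocked-sampling idea described in the abstract (not the authors' exact extended-state-space sampler, whose construction is not given here), the sketch below alternates between sampling all latent sources jointly and sampling per-source scale variables, assuming a sparse linear model x = A s + noise with an assumed Gaussian-scale-mixture (Student-t) prior on the sources. All variable names and hyperparameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def gibbs_step(x, A, lam, sigma2=0.1, a=1.0, b=1.0):
    """One blocked Gibbs sweep for an assumed scale-mixture model:
    sources s | lam, x (joint Gaussian block), then scales lam | s."""
    D, K = A.shape
    # Block 1: s | lam, x is Gaussian with precision A^T A / sigma2 + diag(lam)
    prec = A.T @ A / sigma2 + np.diag(lam)
    cov = np.linalg.inv(prec)
    mean = cov @ (A.T @ x) / sigma2
    s = rng.multivariate_normal(mean, cov)
    # Block 2: with lam_i ~ Gamma(a, rate=b) and s_i | lam_i ~ N(0, 1/lam_i),
    # the conditional is lam_i | s_i ~ Gamma(a + 1/2, rate = b + s_i^2 / 2)
    lam = rng.gamma(a + 0.5, 1.0 / (b + 0.5 * s**2))
    return s, lam

# Toy usage: 8-dimensional data, 2x overcomplete basis, sparse ground-truth sources
D, K = 8, 16
A = rng.standard_normal((D, K))
x = A @ (rng.standard_normal(K) * (rng.random(K) < 0.1))
lam = np.ones(K)
for _ in range(100):
    s, lam = gibbs_step(x, A, lam)

Both conditionals here are standard closed forms, so each sweep is parameter-free in the sense the abstract emphasizes; the paper's persistent expectation-maximization variant would reuse such samples across parameter updates rather than restarting the chain.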