  Mixtures of conditional Gaussian scale mixtures: the best model for natural images

Theis, L., Hosseini, R., & Bethge, M. (2012). Mixtures of conditional Gaussian scale mixtures: the best model for natural images. Poster presented at Bernstein Conference 2012, München, Germany. doi:10.3389/conf.fncom.2012.55.00079.

Creators

Theis, LM, Author           
Hosseini, R1, 2, Author           
Bethge, M1, 2, Author           
Affiliations:
1Research Group Computational Vision and Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497805              
2Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794              

Content

Abstract: Modeling the statistics of natural images is a common problem in computer vision and computational neuroscience. In computational neuroscience, natural image models are used as a means to understand the input to the visual system as well as the visual system's internal representations of the visual input. Here we present a new probabilistic model for images of arbitrary size. Our model is a directed graphical model based on mixtures of Gaussian scale mixtures. Gaussian scale mixtures have repeatedly been shown to be suitable building blocks for capturing the statistics of natural images, but have not been applied in a directed modeling context. Perhaps surprisingly, given the much greater popularity of the undirected Markov random field approach, our directed model yields unprecedented performance when applied to natural images while also being easier to train, sample, and evaluate. Samples from the model look much more natural than samples from other models and capture many long-range higher-order correlations. When trained on dead leaves images or textures, the model is able to reproduce many of their properties as well, demonstrating its flexibility. By extending the model to multiscale representations, it is able to reproduce even longer-range correlations. An important measure for quantifying the amount of correlation captured by a model is the average log-likelihood. We evaluate our model as well as several other patch-based and whole-image models and show that it yields the best performance reported to date when measured in bits per pixel. A problem closely related to image modeling is image compression. We show that our model can compete even with some of the best image compression algorithms.
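The model's core building block, the Gaussian scale mixture, and the bits-per-pixel evaluation measure mentioned in the abstract can be illustrated with a minimal sketch. The scales, weights, and variable names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian scale mixture (GSM): x = sqrt(z) * g with g ~ N(0, 1) and a
# positive scale variable z. Mixing over scales produces the heavy-tailed
# marginals typical of natural image statistics. The discrete scales and
# weights below are illustrative, not taken from the poster.
scales = np.array([0.1, 1.0, 10.0])   # component variances
weights = np.array([0.6, 0.3, 0.1])   # mixing weights, sum to 1

def sample_gsm(n):
    """Draw n samples by first drawing a scale, then a Gaussian."""
    z = rng.choice(scales, size=n, p=weights)
    return np.sqrt(z) * rng.standard_normal(n)

def gsm_logpdf(x):
    """log p(x) = log sum_k w_k N(x; 0, scales[k])."""
    comps = (weights / np.sqrt(2 * np.pi * scales)
             * np.exp(-x[:, None] ** 2 / (2 * scales)))
    return np.log(comps.sum(axis=1))

x = sample_gsm(100_000)

# Excess kurtosis > 0 confirms tails heavier than any single Gaussian,
# one reason GSMs fit natural image filter responses well.
kurtosis = np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3

# Average negative log-likelihood in bits; for image models the analogous
# quantity, averaged over pixels, is the bits-per-pixel score used for
# model comparison in the abstract.
avg_bits = -gsm_logpdf(x).mean() / np.log(2)
```

A mixture of conditional GSMs, as in the poster, would additionally condition the scales and covariances of each pixel on a causal neighborhood of previously generated pixels, which is what makes the model directed and tractable to sample and evaluate.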

Details

Dates: 2012-09
Publication Status: Published online
Identifiers: DOI: 10.3389/conf.fncom.2012.55.00079
BibTex Citekey: TheisHB2012

Event

Title: Bernstein Conference 2012
Place of Event: München, Germany

Source 1

Title: Frontiers in Computational Neuroscience
Abbreviation: Front Comput Neurosci
Source Genre: Journal
Publ. Info: Lausanne : Frontiers Research Foundation
Volume / Issue: 2012 (Conference Abstract: Bernstein Conference 2012)
Start / End Page: 247
Identifier: Other: 1662-5188
CoNE: https://pure.mpg.de/cone/journals/resource/1662-5188