  Estimation Bias in Maximum Entropy Models

Macke, J. H., Murray, I., & Latham, P. E. (2013). Estimation Bias in Maximum Entropy Models. Entropy, 15(8), 3109-3129. doi:10.3390/e15083209

Creators

 Creators:
Macke, J. H.1, Author
Murray, I., Author
Latham, P. E., Author
Affiliations:
1 External Organizations

Content

Free keywords: maximum entropy; sampling bias; asymptotic bias; model-misspecification; neurophysiology; neural population coding; Ising model; dichotomized Gaussian; higher-order interactions; mutual information; cortical networks; primate retina; spike trains; distributions; population diversity; cortex
Abstract: Maximum entropy models have become popular statistical models in neuroscience and other areas in biology and can be useful tools for obtaining estimates of mutual information in biological systems. However, maximum entropy models fit to small data sets can be subject to sampling bias; i.e., the true entropy of the data can be severely underestimated. Here, we study the sampling properties of estimates of the entropy obtained from maximum entropy models. We focus on pairwise binary models, which are used extensively to model neural population activity. We show that if the data is well described by a pairwise model, the bias is equal to the number of parameters divided by twice the number of observations. If, however, the higher-order correlations in the data deviate from those predicted by the model, the bias can be larger. Using a phenomenological model of neural population recordings, we find that this additional bias is highest for small firing probabilities, strong correlations and large population sizes; for the parameters we tested, it was about a factor of four higher. We derive guidelines for how long a neurophysiological experiment needs to be in order to ensure that the bias is less than a specified criterion. Finally, we show how a modified plug-in estimate of the entropy can be used for bias correction.
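
To make the rule of thumb stated in the abstract concrete, here is a minimal sketch in Python (the function names, the use of nats, and the example numbers are illustrative assumptions, not taken from the paper): a pairwise binary maximum entropy model on n units has n field parameters plus n(n-1)/2 pairwise couplings, and the bias of the model-based entropy estimate is roughly that parameter count divided by twice the number of observations. The additional bias that arises when higher-order correlations deviate from the pairwise model (up to about a factor of four in the regime the authors tested) is not included here.

import math

def pairwise_entropy_bias(n_units, n_observations):
    # Number of parameters of a pairwise binary maximum entropy model:
    # n fields plus n*(n-1)/2 pairwise couplings.
    n_params = n_units + n_units * (n_units - 1) // 2
    # Approximate bias of the entropy estimate (in nats), assuming the data
    # are well described by the pairwise model.
    return n_params / (2.0 * n_observations)

def observations_for_bias(n_units, max_bias_nats):
    # Smallest number of observations keeping the bias below the criterion.
    n_params = n_units + n_units * (n_units - 1) // 2
    return math.ceil(n_params / (2.0 * max_bias_nats))

# Example: 100 neurons and a bias criterion of 0.01 nats.
# 100 + 100*99/2 = 5050 parameters, so about 252,500 observations are needed.
print(observations_for_bias(100, 0.01))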

Details

Language(s): eng - English
 Dates: 2013
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: ISI: 000328461300010
DOI: 10.3390/e15083209
ISSN: 1099-4300
 Degree: -


Source 1

Title: Entropy
Alternative Title: Entropy
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: 15 (8)
Sequence Number: -
Start / End Page: 3109 - 3129
Identifier: -