# Item


Released

Journal Article

#### Estimation bias in maximum entropy models

##### External Resource

http://www.mdpi.com/1099-4300/15/8/3109/pdf

(Publisher version)

##### Fulltext (public)

There are no public fulltexts stored in PuRe

##### Supplementary Material (public)

There is no public supplementary material available

##### Citation

Macke, J., Murray, I., & Latham, P. (2013). Estimation bias in maximum entropy
models. *Entropy*, *15*(8), 3109-3219. doi:10.3390/e15083109.

Cite as: http://hdl.handle.net/11858/00-001M-0000-001A-135F-1

##### Abstract

Maximum entropy models have become popular statistical models in neuroscience and other areas in biology and can be useful tools for obtaining estimates of mutual information in biological systems. However, maximum entropy models fit to small data sets can be subject to sampling bias; i.e., the true entropy of the data can be severely underestimated. Here, we study the sampling properties of estimates of the entropy obtained from maximum entropy models. We focus on pairwise binary models, which are used extensively to model neural population activity. We show that if the data is well described by a pairwise model, the bias is equal to the number of parameters divided by twice the number of observations. If, however, the higher order correlations in the data deviate from those predicted by the model, the bias can be larger. Using a phenomenological model of neural population recordings, we find that this additional bias is highest for small firing probabilities, strong correlations and large population sizes—for the parameters we tested, a factor of about four higher. We derive guidelines for how long a neurophysiological experiment needs to be in order to ensure that the bias is less than a specified criterion. Finally, we show how a modified plug-in estimate of the entropy can be used for bias correction.
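The abstract's rule of thumb (bias equal to the number of parameters divided by twice the number of observations) can be turned into a quick calculator. The sketch below is an illustration of that rule only, not the paper's method: it assumes a pairwise binary model on `n_units` neurons, so the parameter count is `n_units` bias terms plus one coupling per pair, and it expresses the bias in the same (unspecified) entropy units throughout.

```python
import math

def n_params_pairwise(n_units):
    # Pairwise binary maximum entropy model: one bias term per unit
    # plus one coupling per unordered pair of units.
    return n_units + n_units * (n_units - 1) // 2

def entropy_bias(n_units, n_obs):
    # Abstract's rule of thumb: bias ~ (number of parameters) / (2 * observations).
    # Holds when the data are well described by the pairwise model; the paper
    # shows the bias can be larger (up to ~4x in their tests) when higher-order
    # correlations deviate from the model.
    return n_params_pairwise(n_units) / (2.0 * n_obs)

def min_observations(n_units, max_bias):
    # Invert the rule of thumb to get a minimum experiment length that keeps
    # the bias below a specified criterion (the "guidelines" in the abstract).
    return math.ceil(n_params_pairwise(n_units) / (2.0 * max_bias))
```

For example, a population of 100 neurons gives `n_params_pairwise(100) == 5050` parameters, so keeping the rule-of-thumb bias below 0.1 entropy units would require at least `min_observations(100, 0.1) == 25250` observations.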