Book Chapter

Estimating Information-Theoretic Quantities


Panzeri, S.
Max Planck Institute for Biological Cybernetics, Max Planck Society


Ince, R., Schultz, S., & Panzeri, S. (2015). Estimating Information-Theoretic Quantities. In D. Jaeger (Ed.), Encyclopedia of Computational Neuroscience (pp. 1137-1148). New York, NY, USA: Springer.

Cite as: http://hdl.handle.net/11858/00-001M-0000-002A-47C5-7
Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience (see entry "Summary of Information Theoretic Quantities"). Estimating these quantities in an accurate and unbiased way from real neurophysiological data frequently presents challenges, which are explained in this entry.
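To illustrate the estimation problem the abstract refers to, the sketch below (not code from the chapter itself) implements the standard plug-in, or maximum-likelihood, estimator of mutual information from paired discrete samples, and shows the well-known limited-sampling bias: for independent variables the true mutual information is zero, yet the plug-in estimate on finite data is almost always positive. All function names and the sample sizes are illustrative choices.

```python
import numpy as np

def plugin_mutual_information(x, y):
    """Plug-in (maximum-likelihood) estimate of I(X;Y) in bits from
    paired discrete samples: build the empirical joint histogram,
    normalize, and apply the mutual-information formula directly."""
    x = np.asarray(x)
    y = np.asarray(y)
    _, x_idx = np.unique(x, return_inverse=True)
    _, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((x_idx.max() + 1, y_idx.max() + 1))
    for i, j in zip(x_idx, y_idx):
        joint[i, j] += 1
    joint /= joint.sum()                      # empirical joint P(x, y)
    px = joint.sum(axis=1, keepdims=True)     # marginal P(x)
    py = joint.sum(axis=0, keepdims=True)     # marginal P(y)
    nz = joint > 0                            # skip zero-probability cells
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Perfectly dependent variables with two equiprobable symbols: I(X;Y) = 1 bit.
print(plugin_mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0

# Limited-sampling bias: X and Y drawn independently, so true MI is 0,
# but the plug-in estimate on 50 trials comes out positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=50)
y = rng.integers(0, 4, size=50)  # independent of x
print(plugin_mutual_information(x, y))  # > 0 despite independence
```

This upward bias grows as the number of response bins increases relative to the number of trials, which is why bias-correction procedures are a central topic when applying these measures to neurophysiological data.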