
Item Details


Released

Book Chapter

Summary of Information Theoretic Quantities

MPS-Authors
/persons/resource/persons84966

Panzeri, S
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Ince, R., Panzeri, S., & Schultz, S. (2015). Summary of Information Theoretic Quantities. In D. Jaeger & R. Jung (Eds.), Encyclopedia of Computational Neuroscience (pp. 2924-2928). New York, NY, USA: Springer.


Cite as: https://hdl.handle.net/11858/00-001M-0000-002A-47D1-D
Abstract
Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. As a framework, it has a number of useful properties: it provides a general measure sensitive to any relationship, not only linear effects; its quantities have meaningful units which, in many cases, allow a direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials rather than in averages over multiple trials. A variety of information theoretic quantities are in common use in neuroscience, including the Shannon entropy, Kullback–Leibler divergence, and mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in practice are provided in the entry “Estimation of Information-Theoretic Quantities,” and examples of the application of these techniques in neuroscience can be found in the entry “Applications of Information Theory to Analysis of Neural Data.”
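For reference, the three quantities named in the abstract have the following standard textbook definitions; the notation (X for a discrete random variable, S and R for stimulus and response) is chosen here for illustration and is not taken from the chapter itself.

% Shannon entropy of a discrete random variable X with distribution p(x):
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Kullback–Leibler divergence between two distributions p and q over the same alphabet:
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}

% Mutual information between stimulus S and response R, equivalently the
% KL divergence between the joint distribution and the product of its marginals:
I(S; R) = \sum_{s,\, r} p(s, r) \log_2 \frac{p(s, r)}{p(s)\, p(r)}

With base-2 logarithms, all three quantities are measured in bits, which is what allows the direct comparison between experiments mentioned in the abstract.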