  Information-theoretic analyses of neural data to minimize the effect of researchers’ assumptions in predictive coding studies

Wollstadt, P., Rathbun, D., Usrey, W., Bastos, A., Lindner, M., Priesemann, V., et al. (2023). Information-theoretic analyses of neural data to minimize the effect of researchers’ assumptions in predictive coding studies. PLoS Computational Biology, 19(11): e1011567. doi:10.1371/journal.pcbi.1011567.

Files:
journal.pcbi.1011567.pdf (Publisher version), 3MB
Description: -
OA-Status: Gold
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -

Creators

Wollstadt, P., Author
Rathbun, D.L., Author
Usrey, W.M., Author
Bastos, A.M., Author
Lindner, M., Author
Priesemann, V.1, Author
Wibral, M., Author
Affiliations:
1Max Planck Research Group Complex Systems Theory, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society, ou_2616694              

Content

Free keywords: -
Abstract: Studies investigating neural information processing often implicitly ask both which processing strategy out of several alternatives is used and how this strategy is implemented in neural dynamics. A prime example is the study of predictive coding: such studies often ask whether confirmed predictions about inputs or prediction errors between internal predictions and inputs are passed on in a hierarchical neural system, while at the same time looking for the neural correlates of coding for errors and predictions. If we do not know exactly what a neural system predicts at any given moment, this results in a circular analysis, as has rightly been criticized. To circumvent such circularity, we propose expressing information processing strategies (such as predictive coding) in terms of local information-theoretic quantities, so that they can be estimated directly from neural data. We demonstrate our approach by investigating two opposing accounts of predictive coding-like processing strategies, quantifying the building blocks of predictive coding, namely the predictability of inputs and the transfer of information, by local active information storage and local transfer entropy. We define testable hypotheses on the relationship between the two quantities, allowing us to identify which of the assumed strategies was used. We demonstrate our approach on spiking data collected from the retinogeniculate synapse of the cat (N = 16). Applying our local information dynamics framework, we show that the synapse codes for predictable rather than surprising input. To support our findings, we estimate quantities from the partial information decomposition framework, which allow us to differentiate whether the transferred information is primarily bottom-up sensory input or information transferred conditionally on the current state of the synapse. Consistent with our local information-theoretic results, we find that the synapse preferentially transfers bottom-up information.
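The two building blocks named in the abstract, local active information storage (predictability of the input from its own past) and local transfer entropy (information transferred from a source), can be illustrated with a minimal plug-in estimator for binary spike trains. This is a rough sketch under stated assumptions, not the authors' estimation pipeline: the history lengths `k` and `l`, the function names, and the use of simple maximum-likelihood (plug-in) probabilities are choices made here for illustration only.

```python
# Minimal sketch: local active information storage (LAIS) and local
# transfer entropy (LTE) for binary spike trains, using plug-in
# probability estimates from empirical counts. Illustrative only.
from collections import Counter
import numpy as np

def local_ais(x, k=1):
    """LAIS at each time t: log2 p(x_t, past_k) / (p(past_k) p(x_t))."""
    pasts = [tuple(x[t - k:t]) for t in range(k, len(x))]
    currs = [x[t] for t in range(k, len(x))]
    n = len(currs)
    c_joint = Counter(zip(pasts, currs))
    c_past, c_curr = Counter(pasts), Counter(currs)
    # Pointwise mutual information between current value and its own past.
    return np.array([np.log2(n * c_joint[(p, c)] / (c_past[p] * c_curr[c]))
                     for p, c in zip(pasts, currs)])

def local_te(src, tgt, k=1, l=1):
    """LTE at each time t: log2 p(x_t | x_past, y_past) / p(x_t | x_past)."""
    h = max(k, l)
    trips = [(tuple(tgt[t - k:t]), tuple(src[t - l:t]), tgt[t])
             for t in range(h, len(tgt))]
    c_full = Counter(trips)                              # (x_past, y_past, x_t)
    c_cond = Counter((xp, yp) for xp, yp, _ in trips)    # (x_past, y_past)
    c_tgt = Counter((xp, x) for xp, _, x in trips)       # (x_past, x_t)
    c_xp = Counter(xp for xp, _, _ in trips)             # (x_past,)
    return np.array([np.log2((c_full[tr] / c_cond[tr[:2]])
                             / (c_tgt[(tr[0], tr[2])] / c_xp[tr[0]]))
                     for tr in trips])

# Usage: a fully predictable alternating train stores ~1 bit per step,
# and a target that copies a random source with one-step lag receives
# ~1 bit of transfer entropy per step.
ais = local_ais([0, 1] * 500)           # mean over time ≈ 1 bit
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 2000)
te = local_te(y, np.roll(y, 1))         # mean over time ≈ 1 bit
```

Averaging the local values over time recovers the conventional (average) active information storage and transfer entropy; the paper's hypotheses concern the time-resolved local values themselves, which the sketch returns per time step.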

Details

Language(s): eng - English
Dates: 2023-11-17
Publication Status: Published online
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: DOI: 10.1371/journal.pcbi.1011567
Degree: -

Project information

Project name: ---
Grant ID: -
Funding program: -
Funding organization: -

Source 1

Title: PLoS Computational Biology
Abbreviation: PLoS Comput Biol
Source Genre: Journal
Publ. Info: San Francisco, CA : Public Library of Science
Pages: -
Volume / Issue: 19 (11)
Sequence Number: e1011567
Start / End Page: -
Identifier: ISSN: 1553-734X
CoNE: https://pure.mpg.de/cone/journals/resource/1000000000017180_1