A decision-theoretic approach to binocular rivalry


Safavi, S.
Department of Computational Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society

Logothetis, N.
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Levina, A.
Institutional Guests, Max Planck Institute for Biological Cybernetics, Max Planck Society


Safavi, S., Chalk, M., Logothetis, N., & Levina, A. (2023). A decision-theoretic approach to binocular rivalry. Poster presented at 32nd Annual Computational Neuroscience Meeting (CNS*2023), Leipzig, Germany.

Cite as: https://hdl.handle.net/21.11116/0000-000D-ACAB-1
Over the past decades, multiple studies have reported signatures of criticality in various neuronal recordings. Moreover, theoretical investigations demonstrate that multiple aspects of information processing are optimized at a second-order phase transition. These studies motivated the hypothesis that the brain operates close to a critical state. While several computational aspects of sensory information processing (e.g., sensitivity to input, dynamic range, or information transmission) have been shown to be optimal in this regime, it is still unclear whether these computational benefits of criticality can be leveraged by neural systems to perform behaviorally relevant computations.
To address this question, we investigate signatures of criticality in networks optimized to perform efficient encoding of stimuli. We consider a network of leaky integrate-and-fire neurons with synaptic transmission delays whose connectivity and dynamics are optimized for efficient coding. It was previously shown that the performance of such networks varies non-monotonically with the noise amplitude. We consider networks with different noise amplitudes and evaluate how close each is to the critical state by measuring the deviation of its avalanche size distribution from the nearest power law. Interestingly, we find that in the vicinity of the noise level optimal for efficient coding, the network dynamics exhibit signatures of criticality: the distribution of avalanche sizes is closest to a power law. When the noise amplitude is too low or too high for efficient coding, the network appears super-critical or sub-critical, respectively.
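The deviation measure described above can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: it assumes avalanches are defined as runs of consecutive non-empty time bins, fits the power-law exponent by maximum likelihood over a grid, and quantifies the deviation with a Kolmogorov–Smirnov distance; the Poisson spike counts stand in for actual network activity.

```python
import numpy as np

def avalanche_sizes(spike_counts):
    """Avalanches = runs of consecutive non-empty time bins;
    the size of an avalanche is the total spike count in the run."""
    sizes, current = [], 0
    for c in spike_counts:
        if c > 0:
            current += c
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return np.array(sizes)

def ks_to_power_law(sizes, alphas=np.linspace(1.1, 3.0, 96)):
    """KS distance between the empirical size distribution and the
    best-fitting discrete power law P(s) ~ s^-alpha on 1..max(sizes),
    with the exponent chosen by grid-search maximum likelihood."""
    s = np.arange(1, sizes.max() + 1)
    logliks = [(-a * np.log(sizes)).sum()
               - len(sizes) * np.log((s ** -a).sum()) for a in alphas]
    a_hat = alphas[int(np.argmax(logliks))]
    pmf = s ** -a_hat
    pmf /= pmf.sum()
    model_cdf = np.cumsum(pmf)
    emp_cdf = np.array([(sizes <= k).mean() for k in s])
    return a_hat, np.abs(emp_cdf - model_cdf).max()

# Toy data: Poisson spike counts stand in for binned network activity.
rng = np.random.default_rng(0)
counts = rng.poisson(0.9, size=20_000)
alpha, ks = ks_to_power_law(avalanche_sizes(counts))
print(f"fitted exponent {alpha:.2f}, KS deviation {ks:.3f}")
```

Sweeping the noise amplitude of the simulated network and plotting this KS deviation against it would reproduce the kind of comparison reported above: a minimum of the deviation near the noise level that also minimizes the coding error.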
Lastly, we verify the robustness of this result to changes in network size by considering networks ranging from N = 50 to N = 400 neurons. All networks show a similar non-monotonic dependence of both the reconstruction error and the deviation from scale-free dynamics on the noise strength. The cut-off of the scale-free distribution shifts to larger values with network size, hinting at the correct finite-size scaling behavior. This result has important implications, as it shows how two influential and previously disparate fields, efficient coding and criticality, might be intimately related.
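The finite-size check can be illustrated with synthetic data. The sketch below is an assumption-laden toy, not the poster's analysis: it draws avalanche sizes from a hard-truncated power law whose cut-off is set proportional to a hypothetical network size N, estimates the cut-off as an upper quantile, and confirms that the estimate shifts right as N grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_truncated_power_law(n, alpha, s_max):
    """Draw sizes from P(s) ~ s^-alpha on the discrete support 1..s_max."""
    s = np.arange(1, s_max + 1)
    pmf = s ** -alpha
    pmf /= pmf.sum()
    return rng.choice(s, size=n, p=pmf)

def estimated_cutoff(sizes, q=0.99):
    """Crude cut-off estimate: upper quantile of the size distribution."""
    return np.quantile(sizes, q)

# Hypothetical assumption: cut-off proportional to network size N.
# Under finite-size scaling the estimated cut-off should grow with N.
cutoffs = {N: estimated_cutoff(sample_truncated_power_law(50_000, 1.5, 4 * N))
           for N in (50, 100, 200, 400)}
print(cutoffs)
```

In the actual analysis the sizes would come from the simulated networks rather than from a sampler, but the same quantile-based diagnostic applies: a cut-off that moves right with N is the signature of finite-size scaling described above.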