
Released

Journal Article

Local dendritic balance enables learning of efficient representations in networks of spiking neurons

MPS-Authors
/persons/resource/persons258568

Mikulasch, Fabian
Max Planck Research Group Neural Systems Theory, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;

/persons/resource/persons207005

Rudelt, Lucas
Max Planck Research Group Neural Systems Theory, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;

/persons/resource/persons173619

Priesemann, Viola
Max Planck Research Group Neural Systems Theory, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;

Citation

Mikulasch, F., Rudelt, L., & Priesemann, V. (2021). Local dendritic balance enables learning of efficient representations in networks of spiking neurons. Proceedings of the National Academy of Sciences, 118(50): e2021925118. doi:10.1073/pnas.2021925118.


Cite as: https://hdl.handle.net/21.11116/0000-0009-9B81-6
Abstract
How can neural networks learn to efficiently represent complex and high-dimensional inputs via local plasticity mechanisms? Classical models of representation learning assume that feedforward weights are learned via pairwise Hebbian-like plasticity. Here, we show that pairwise Hebbian-like plasticity works only under unrealistic requirements on neural dynamics and input statistics. To overcome these limitations, we derive from first principles a learning scheme based on voltage-dependent synaptic plasticity rules. In this scheme, recurrent connections learn to balance feedforward input locally in individual dendritic compartments and can thereby modulate synaptic plasticity to learn efficient representations. We demonstrate in simulations that this learning scheme works robustly even for complex high-dimensional inputs and with inhibitory transmission delays, where Hebbian-like plasticity fails. Our results draw a direct connection between dendritic excitatory–inhibitory balance and voltage-dependent synaptic plasticity as observed in vivo and suggest that both are crucial for representation learning.
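The core mechanism the abstract describes — recurrent connections learning to balance feedforward input locally, with plasticity gated by the local voltage — can be caricatured in a few lines of rate-based code. This is an illustrative sketch only, not the paper's spiking model: the network sizes, learning rate, and rate nonlinearity below are all assumptions made for the example.

```python
import numpy as np

# Illustrative, rate-based caricature of local dendritic balance
# (NOT the paper's actual spiking model): recurrent weights W learn,
# via a voltage-dependent rule, to cancel the feedforward drive F @ x
# in each "dendritic compartment", so the local voltage v is driven
# toward zero (balance), at which point plasticity stops.
rng = np.random.default_rng(0)

n_in, n_hid = 20, 5                      # sizes chosen arbitrarily
F = rng.normal(0.0, 0.1, (n_hid, n_in))  # fixed feedforward weights
W = np.zeros((n_hid, n_hid))             # recurrent weights, learned here
eta = 0.1                                # learning rate (assumed)

errs = []
for _ in range(2000):
    x = rng.random(n_in)                 # random input pattern
    r = np.maximum(F @ x, 0.0)           # stand-in for population firing rates
    v = F @ x - W @ r                    # local voltage: feedforward minus recurrent
    # Voltage-dependent plasticity: the update is proportional to the
    # local voltage v, so learning is self-limiting once v ~ 0.
    W += eta * np.outer(v, r)
    errs.append(np.mean(np.abs(v)))      # track residual imbalance
```

Because the weight update is proportional to the local voltage rather than to postsynaptic spiking alone, the residual imbalance `errs` shrinks over training — the key property that distinguishes this scheme from pairwise Hebbian-like plasticity.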