Journal Article

Dendritic normalisation improves learning in sparsely connected artificial neural networks

MPS-Authors

Bird, Alex D.
Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Max Planck Society;
Cuntz Lab, Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Max Planck Society;

Cuntz, Hermann
Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Max Planck Society;
Cuntz Lab, Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Max Planck Society;

Fulltext (public)

Bird_2021_DendriticNormalisation.pdf
(Publisher version), 3MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Bird, A. D., Jedlicka, P., & Cuntz, H. (2021). Dendritic normalisation improves learning in sparsely connected artificial neural networks. PLoS Computational Biology, 17(8): e1009202. doi:10.1371/journal.pcbi.1009202.


Cite as: https://hdl.handle.net/21.11116/0000-0009-D14D-5
Abstract
Artificial neural networks, taking inspiration from biological neurons, have become an invaluable tool for machine learning applications. Recent studies have developed techniques to effectively tune the connectivity of sparsely-connected artificial neural networks, which have the potential to be more computationally efficient than their fully-connected counterparts and more closely resemble the architectures of biological systems. We here present a normalisation, based on the biophysical behaviour of neuronal dendrites receiving distributed synaptic inputs, that divides the weight of an artificial neuron's afferent contacts by their number. We apply this dendritic normalisation to various sparsely-connected feedforward network architectures, as well as simple recurrent and self-organised networks with spatially extended units. The learning performance is significantly increased, providing an improvement over other widely-used normalisations in sparse networks. The results are two-fold, being both a practical advance in machine learning and an insight into how the structure of neuronal dendritic arbours may contribute to computation.
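The core operation described in the abstract — dividing the weight of each artificial neuron's afferent contacts by their number — can be sketched in a few lines. The snippet below is an illustrative sketch only, not the authors' implementation: the variable names, the sparsity pattern, and the use of a static mask are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 8, 4

# Sparse connectivity: each potential contact exists with probability 0.5.
# (The sparsity scheme here is purely illustrative.)
mask = rng.random((n_in, n_out)) < 0.5
weights = rng.normal(size=(n_in, n_out)) * mask

# Number of afferent (incoming) contacts per output unit;
# clamp at 1 to avoid division by zero for isolated units.
n_afferent = np.maximum(mask.sum(axis=0), 1)

# Dendritic normalisation: scale each unit's incoming weights
# by the inverse of its number of afferent contacts.
normalised = weights / n_afferent

# Forward pass through the normalised sparse layer.
x = rng.normal(size=n_in)
activation = x @ normalised
```

Each output unit's pre-activation is thereby made insensitive to how many inputs it happens to receive, mirroring the passive-cable intuition that a dendrite distributing more synapses also attenuates each one more.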