A Measure of the Complexity of Neural Representations based on Partial Information Decomposition

Ehrlich, D. A., Schneider, A. C., Priesemann, V., Wibral, M., & Makkeh, A. (2023). A Measure of the Complexity of Neural Representations based on Partial Information Decomposition. Transactions on Machine Learning Research, 05/2023.

Files

716_a_measure_of_the_complexity_of.pdf (Publisher version), 2MB
Name: 716_a_measure_of_the_complexity_of.pdf
Description: -
OA-Status: Not specified
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Locators

Description: -
OA-Status: Not specified

Creators

Creators:
Ehrlich, David Alexander (1), Author
Schneider, Andreas Christian (1), Author
Priesemann, Viola (1), Author
Wibral, Michael, Author
Makkeh, Abdullah, Author
Affiliations:
(1) Max Planck Research Group Complex Systems Theory, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society, ou_2616694

Content

Free keywords: -
 Abstract: In neural networks, task-relevant information is represented jointly by groups of neurons. However, the specific way in which this mutual information about the classification label is distributed among the individual neurons is not well understood: While parts of it may only be obtainable from specific single neurons, other parts are carried redundantly or synergistically by multiple neurons. We show how Partial Information Decomposition (PID), a recent extension of information theory, can disentangle these different contributions. From this, we introduce the measure of "Representational Complexity", which quantifies the difficulty of accessing information spread across multiple neurons. We show how this complexity is directly computable for smaller layers. For larger layers, we propose subsampling and coarse-graining procedures and prove corresponding bounds on the latter. Empirically, for quantized deep neural networks solving the MNIST and CIFAR10 tasks, we observe that representational complexity decreases both through successive hidden layers and over training, and compare the results to related measures. Overall, we propose representational complexity as a principled and interpretable summary statistic for analyzing the structure and evolution of neural representations and complex systems in general.
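
For orientation, a sketch of the measure's formal shape in our own notation (an assumed rendering, not quoted from the record): representational complexity can be written as a weighted average over the atoms of the PID redundancy lattice, where each atom contributes the size of the smallest set of neurons through which its information can be accessed,

C(Y : T_1, \dots, T_n) \;=\; \sum_{\alpha \in \mathcal{A}} \frac{\Pi(\alpha)}{I(Y : T_1, \dots, T_n)} \, \min_{\mathbf{a} \in \alpha} |\mathbf{a}|,

where \mathcal{A} denotes the antichains of the redundancy lattice over the neurons T_1, \dots, T_n, \Pi(\alpha) the partial information of atom \alpha, and |\mathbf{a}| the number of neurons in source set \mathbf{a}. Under this form, C = 1 when all label information is accessible from single neurons and C = n when it is carried purely synergistically by all n neurons jointly, matching the abstract's reading of complexity as the difficulty of accessing distributed information.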

Details

Language(s): eng - English
Dates: 2023
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: -
Identifiers: -
Degree: -

Source 1

Title: Transactions on Machine Learning Research
Abbreviation: Transact. mach. learn. res.
Other: TMLR
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: 31
Volume / Issue: 05/2023
Sequence Number: -
Start / End Page: -
Identifier: ISSN: 2835-8856