  Lyapunov spectra of chaotic recurrent neural networks

Engelken, R., Wolf, F., & Abbott, L. F. (2023). Lyapunov spectra of chaotic recurrent neural networks. Physical Review Research, 5(4): 043044. doi:10.1103/PhysRevResearch.5.043044.


Files

PhysRevResearch.5.043044.pdf (Publisher version), 4MB
Name:
PhysRevResearch.5.043044.pdf
Description:
-
OA-Status:
Gold
Visibility:
Public
MIME-Type / Checksum:
application/pdf / [MD5]
Technical Metadata:
Copyright Date:
-
Copyright Info:
-


Creators

 Creators:
Engelken, Rainer (1), Author
Wolf, Fred (1), Author
Abbott, L. F., Author
Affiliations:
(1) Research Group Theoretical Neurophysics, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society, ou_2063289

Content

Free keywords: -
 Abstract: Recurrent networks are widely used as models of biological neural circuits and in artificial intelligence applications. Mean-field theory has been used to uncover key properties of recurrent network models such as the onset of chaos and their largest Lyapunov exponents, but quantities such as attractor dimension and Kolmogorov-Sinai entropy have thus far remained elusive. We calculate the complete Lyapunov spectrum of recurrent neural networks and show that chaos in these networks is extensive with a size-invariant Lyapunov spectrum and attractor dimensions much smaller than the number of phase space dimensions. The attractor dimension and entropy rate increase with coupling strength near the onset of chaos but decrease far from the onset, reflecting a reduction in the number of unstable directions. We analytically approximate the full Lyapunov spectrum using random matrix theory near the onset of chaos for strong coupling and discrete-time dynamics. We show that a generalized time-reversal symmetry of the network dynamics induces a point symmetry of the Lyapunov spectrum reminiscent of the symplectic structure of chaotic Hamiltonian systems. Temporally fluctuating input can drastically reduce both the entropy rate and the attractor dimension. We lay out a comprehensive set of controls for the accuracy and convergence of Lyapunov exponents. For trained recurrent networks, we find that Lyapunov spectrum analysis quantifies error propagation and stability achieved by different learning algorithms. Our methods apply to systems of arbitrary connectivity and highlight the potential of Lyapunov spectrum analysis as a diagnostic for machine learning applications of recurrent networks.
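The computation summarized above, the full Lyapunov spectrum of a recurrent network together with the attractor dimension and entropy rate derived from it, can be illustrated with the standard QR-reorthonormalization (Benettin) method. The sketch below is a minimal illustration of that generic technique, not the authors' implementation: it iterates a discrete-time random rate network x_{t+1} = J tanh(x_t), accumulates the log growth rates of an orthonormal perturbation basis, and then reads off the Kaplan-Yorke attractor dimension and the sum of positive exponents (an upper bound on the Kolmogorov-Sinai entropy rate). The network size N, coupling strength g, and iteration counts are illustrative placeholders.

import numpy as np

# Minimal sketch (not the paper's code): Lyapunov spectrum of a discrete-time
# random rate network x_{t+1} = J tanh(x_t) via QR reorthonormalization.
rng = np.random.default_rng(0)
N, g = 100, 2.0                                    # network size, coupling strength (illustrative)
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random coupling matrix

def step(x):
    return J @ np.tanh(x)

def jacobian(x):
    # d x_{t+1} / d x_t = J diag(1 - tanh(x)^2)
    return J * (1.0 - np.tanh(x) ** 2)

x = rng.standard_normal(N)
for _ in range(1000):                              # discard transient
    x = step(x)

Q = np.linalg.qr(rng.standard_normal((N, N)))[0]   # orthonormal perturbation basis
log_growth = np.zeros(N)
T = 2000                                           # accumulation steps (convergence must be checked)
for _ in range(T):
    Q, R = np.linalg.qr(jacobian(x) @ Q)
    log_growth += np.log(np.abs(np.diag(R)))
    x = step(x)

lyap = np.sort(log_growth / T)[::-1]               # Lyapunov exponents, descending

# Kaplan-Yorke attractor dimension: number of leading exponents whose partial
# sum stays nonnegative, plus an interpolated fraction of the next exponent.
csum = np.cumsum(lyap)
k = int(np.sum(csum >= 0))
D_KY = k + (csum[k - 1] / abs(lyap[k]) if 0 < k < N else 0.0)

# Entropy rate is bounded above by the sum of the positive exponents.
H_KS = lyap[lyap > 0].sum()
print(f"lambda_max = {lyap[0]:.3f}, D_KY = {D_KY:.2f}, H_KS = {H_KS:.3f}")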

Details

Language(s): eng - English
Dates: 2023-10-16, 2023-12
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1103/PhysRevResearch.5.043044
 Degree: -

Project information

Project name: Research supported by an NSF NeuroNex Award (Grant No. DBI-1707398), the Gatsby Charitable Foundation (Grant No. GAT3708), the Simons Collaboration for the Global Brain (542939SPI), and the Swartz Foundation (2019-5). This work was further supported by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) 436260547 in relation to NeuroNex (National Science Foundation 2015276) and under Germany's Excellence Strategy - EXC 2067/1-390729940, DFG - Project-ID 317475864 - SFB 1286, DFG - Project-ID 454648639 - SFB 1528, DFG - Project-ID 273725443 - SPP 1782, DFG - Project-ID 430156276 - SPP 2205, and by the Leibniz Association (project K265/2019) (F.W.).
Grant ID: -
Funding program: -
Funding organization: -

Source 1

Title: Physical Review Research
Abbreviation: Phys. Rev. Research
Source Genre: Journal
 Creator(s):
Affiliations:
Publ. Info: College Park, Maryland, United States : American Physical Society (APS)
Pages: 28
Volume / Issue: 5 (4)
Sequence Number: 043044
Start / End Page: -
Identifier: ISSN: 2643-1564
CoNE: https://pure.mpg.de/cone/journals/resource/2643-1564