
Item Details


Released

Journal Article

Lyapunov spectra of chaotic recurrent neural networks

MPS-Authors
/persons/resource/persons215422

Engelken,  Rainer
Research Group Theoretical Neurophysics, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;

/persons/resource/persons173710

Wolf,  Fred
Research Group Theoretical Neurophysics, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society;

External Resource
There are no locators available
Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)

PhysRevResearch.5.043044.pdf
(publisher version), 4MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Engelken, R., Wolf, F., & Abbott, L. F. (2023). Lyapunov spectra of chaotic recurrent neural networks. Physical Review Research, 5(4): 043044. doi:10.1103/PhysRevResearch.5.043044.


Cite as: https://hdl.handle.net/21.11116/0000-000D-F829-E
Abstract
Recurrent networks are widely used as models of biological neural circuits and in artificial intelligence applications. Mean-field theory has been used to uncover key properties of recurrent network models such as the onset of chaos and their largest Lyapunov exponents, but quantities such as attractor dimension and Kolmogorov-Sinai entropy have thus far remained elusive. We calculate the complete Lyapunov spectrum of recurrent neural networks and show that chaos in these networks is extensive with a size-invariant Lyapunov spectrum and attractor dimensions much smaller than the number of phase space dimensions. The attractor dimension and entropy rate increase with coupling strength near the onset of chaos but decrease far from the onset, reflecting a reduction in the number of unstable directions. We analytically approximate the full Lyapunov spectrum using random matrix theory near the onset of chaos for strong coupling and discrete-time dynamics. We show that a generalized time-reversal symmetry of the network dynamics induces a point symmetry of the Lyapunov spectrum reminiscent of the symplectic structure of chaotic Hamiltonian systems. Temporally fluctuating input can drastically reduce both the entropy rate and the attractor dimension. We lay out a comprehensive set of controls for the accuracy and convergence of Lyapunov exponents. For trained recurrent networks, we find that Lyapunov spectrum analysis quantifies error propagation and stability achieved by different learning algorithms. Our methods apply to systems of arbitrary connectivity and highlight the potential of Lyapunov spectrum analysis as a diagnostic for machine learning applications of recurrent networks.
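
For orientation, the following is a minimal numerical sketch of the standard Benettin/QR reorthonormalization method that underlies full Lyapunov-spectrum calculations of this kind. It is not the authors' code: the rate-network model dx/dt = -x + J tanh(x), the coupling strength g, the network size, the time step, and all durations are illustrative assumptions.

    import numpy as np

    # Minimal sketch (illustrative, not the authors' code): Lyapunov spectrum
    # of a rate network dx/dt = -x + J tanh(x) via the Benettin/QR method.
    N, g = 100, 2.0                                   # assumed size and coupling gain
    rng = np.random.default_rng(0)
    J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # Gaussian random coupling

    dt = 0.1                                          # Euler time step
    warmup, steps, ons_every = 1000, 10000, 10        # transient, total, QR cadence

    x = rng.standard_normal(N)
    for _ in range(warmup):                           # discard the transient
        x += dt * (-x + J @ np.tanh(x))

    Q = np.linalg.qr(rng.standard_normal((N, N)))[0]  # orthonormal tangent frame
    log_r = np.zeros(N)                               # accumulated log stretch factors

    for step in range(steps):
        phi = np.tanh(x)
        # One-step Jacobian of the Euler map: (1 - dt) I + dt J diag(1 - phi^2)
        D = (1.0 - dt) * np.eye(N) + dt * J * (1.0 - phi**2)
        x += dt * (-x + J @ phi)                      # advance the state
        Q = D @ Q                                     # advance the tangent frame
        if (step + 1) % ons_every == 0:
            Q, R = np.linalg.qr(Q)                    # reorthonormalize
            log_r += np.log(np.abs(np.diag(R)))

    lyap = np.sort(log_r / (steps * dt))[::-1]        # exponents, descending

    # Derived quantities discussed in the abstract (assumes a chaotic regime):
    # Kolmogorov-Sinai entropy rate via Pesin's bound (sum of positive
    # exponents) and attractor dimension via the Kaplan-Yorke formula.
    h_ks = lyap[lyap > 0].sum()
    cum = np.cumsum(lyap)
    k = int((cum >= 0).sum())                         # largest k with partial sum >= 0
    assert 0 < k < N                                  # holds away from the edge cases
    d_ky = k + cum[k - 1] / abs(lyap[k])
    print(f"largest exponent {lyap[0]:.3f}, h_KS {h_ks:.3f}, D_KY {d_ky:.1f}")

Reorthonormalizing the tangent frame every few steps prevents all columns from collapsing onto the most expanding direction; the accumulated logarithms of the R diagonal converge to the Lyapunov exponents, from which the entropy rate and the Kaplan-Yorke attractor dimension follow.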