  Fading memory as inductive bias in residual recurrent networks

Dubinin, I., & Effenberger, F. (2024). Fading memory as inductive bias in residual recurrent networks. Neural Networks, 173, 106179. doi:10.1016/j.neunet.2024.106179.


Basic information

Item permalink: https://hdl.handle.net/21.11116/0000-000E-9D2B-2
Version permalink: https://hdl.handle.net/21.11116/0000-000E-9D2C-1
Item type: Journal article

Files

Dubinin_2024_FadingMemory.pdf (publisher version), 3MB
File permalink:
https://hdl.handle.net/21.11116/0000-000E-9D2D-0
File name:
Dubinin_2024_FadingMemory.pdf
Description:
-
OA-Status:
Hybrid
Visibility:
Public
MIME type / checksum:
application/pdf / [MD5]
Technical metadata:
Copyright date:
2024
Copyright info:
© 2024 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).


Creators

Creators:
Dubinin, Igor 1, 2, Author
Effenberger, Felix 1, 2, Author
Affiliations:
1 Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Max Planck Society, Deutschordenstr. 46, 60528 Frankfurt, DE, ou_2074314
2 Singer Lab, Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, Max Planck Society, Deutschordenstraße 46, 60528 Frankfurt, DE, ou_3381220

Content

Keywords: Recurrent neural network; Inductive bias; Residual connection; Memory
Abstract: Residual connections have been proposed as an architecture-based inductive bias to mitigate the problem of exploding and vanishing gradients and to increase task performance in both feed-forward and recurrent networks (RNNs) when trained with the backpropagation algorithm. Yet, little is known about how residual connections in RNNs influence their dynamics and fading memory properties. Here, we introduce weakly coupled residual recurrent networks (WCRNNs) in which residual connections result in well-defined Lyapunov exponents and allow for studying properties of fading memory. We investigate how the residual connections of WCRNNs influence their performance, network dynamics, and memory properties on a set of benchmark tasks. We show that several distinct forms of residual connections yield effective inductive biases that result in increased network expressivity. In particular, those are residual connections that (i) result in network dynamics at the proximity of the edge of chaos, (ii) allow networks to capitalize on characteristic spectral properties of the data, and (iii) result in heterogeneous memory properties. In addition, we demonstrate how our results can be extended to non-linear residuals and introduce a weakly coupled residual initialization scheme that can be used for Elman RNNs.
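To make the idea of a weakly coupled residual recurrent update concrete, the following is a minimal sketch. It is not the authors' implementation: the exact WCRNN formulation is not given in the abstract, so the coupling constant `alpha`, the `tanh` nonlinearity, and the weight scales are all illustrative assumptions. The sketch only shows the generic shape of a residual recurrent step, where a small `alpha` keeps the dynamics close to the identity map and thus slows the fading of memory.

```python
import numpy as np

# Illustrative residual recurrent update in the spirit of a weakly
# coupled residual RNN. All constants below are assumptions, not the
# published model.

rng = np.random.default_rng(0)
n_hidden, n_input, T = 32, 8, 100
alpha = 0.1  # weak coupling: small alpha keeps dynamics near the identity

W = rng.normal(0, 1.0 / np.sqrt(n_hidden), (n_hidden, n_hidden))
U = rng.normal(0, 1.0 / np.sqrt(n_input), (n_hidden, n_input))

def step(h, x):
    # Residual update: the hidden state carries over unchanged,
    # plus a weakly coupled nonlinear correction.
    return h + alpha * np.tanh(W @ h + U @ x)

h = np.zeros(n_hidden)
for t in range(T):
    x = rng.normal(size=n_input)
    h = step(h, x)

print(h.shape)  # (32,)
```

With `alpha = 1` this reduces to an ordinary residual step, while `alpha → 0` freezes the state entirely; the interesting regime studied in the paper lies between these extremes, near the edge of chaos.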

Details

Language: -
Dates: 2024-02-15, 2024-05
Publication status: Published
Pages: -
Publishing info: -
Table of contents: -
Review: Peer review
Identifiers: DOI: 10.1016/j.neunet.2024.106179
Degree: -


Source 1

Title: Neural Networks
Type: Journal
Authors / editors: -
Affiliations: -
Publisher, place: -
Pages: -
Volume: 173
Article number: 106179
Start / end page: -
Identifiers: ISSN: 0893-6080