  The impact of memory on learning sequence-to-sequence tasks

Seif, A., Loos, S., Tucci, G., Roldán, I., & Goldt, S. (2024). The impact of memory on learning sequence-to-sequence tasks. Machine Learning: Science and Technology, 5(1): 015053. doi:10.1088/2632-2153/ad2feb.


Files

Seif_2024_Mach._Learn. _Sci._Technol._5_015053.pdf (Publisher version), 2MB
Name: Seif_2024_Mach._Learn. _Sci._Technol._5_015053.pdf
Description: -
OA-Status: Hybrid
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -

Creators

Creators:
Seif, A., Author
Loos, S.A.M., Author
Tucci, G.1, Author
Roldán, I., Author
Goldt, S., Author
Affiliations:
1 Max Planck Institute for Dynamics and Self-Organization, Max Planck Society, ou_2063285

Content

Free keywords: -
Abstract: The recent success of neural networks in natural language processing has drawn renewed attention to learning sequence-to-sequence (seq2seq) tasks. While there exists a rich literature that studies classification and regression tasks using solvable models of neural networks, seq2seq tasks have not yet been studied from this perspective. Here, we propose a simple model for a seq2seq task that has the advantage of providing explicit control over the degree of memory, or non-Markovianity, in the sequences: the stochastic switching Ornstein–Uhlenbeck (SSOU) model. We introduce a measure of non-Markovianity to quantify the amount of memory in the sequences. For a minimal auto-regressive (AR) learning model trained on this task, we identify two learning regimes corresponding to distinct phases in the stationary state of the SSOU process. These phases emerge from the interplay between two different time scales that govern the sequence statistics. Moreover, we observe that while increasing the integration window of the AR model always improves performance, albeit with diminishing returns, increasing the non-Markovianity of the input sequences can improve or degrade its performance. Finally, we perform experiments with recurrent and convolutional neural networks that show that our observations carry over to more complicated neural network architectures.
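The switching Ornstein–Uhlenbeck process named in the abstract can be sketched numerically. The following minimal simulation is an illustrative assumption, not the paper's implementation: it takes "stochastic switching" to mean a mean level that jumps between two values at a Poisson rate (a telegraph process), discretized with Euler–Maruyama; all parameter names and default values are hypothetical.

```python
import numpy as np

def simulate_ssou(n_steps=10_000, dt=0.01, theta=1.0, sigma=0.5,
                  switch_rate=0.2, means=(-1.0, 1.0), seed=0):
    """Sketch of a stochastic switching Ornstein-Uhlenbeck trajectory.

    A latent mean jumps between the two values in `means` at Poisson
    rate `switch_rate` (telegraph process). Between jumps, x relaxes
    toward the current mean at rate `theta` under Gaussian noise of
    strength `sigma` (Euler-Maruyama discretization with step `dt`).
    """
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = 0.0
    state = 0  # index into `means`; flips stochastically
    for t in range(1, n_steps):
        if rng.random() < switch_rate * dt:  # Poisson switching event
            state = 1 - state
        drift = -theta * (x[t - 1] - means[state])
        x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x
```

The two time scales the abstract refers to would correspond here to the relaxation time 1/theta and the mean switching time 1/switch_rate; their ratio controls whether the sequence looks like a noisy two-state signal or a single broad distribution.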

Details

Language(s): eng - English
Dates: 2024-03-21
Publication Status: Published online
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: DOI: 10.1088/2632-2153/ad2feb
Degree: -

Source 1

Title: Machine Learning: Science and Technology
Abbreviation: Mach. Learn.: Sci. Technol.
Source Genre: Journal
Publ. Info: Bristol, UK : IOP Publishing
Pages: -
Volume / Issue: 5 (1)
Sequence Number: 015053
Start / End Page: -
Identifier: ISSN: 2632-2153
CoNE: https://pure.mpg.de/cone/journals/resource/2632-2153