  Tracking human skill learning with a hierarchical Bayesian sequence model

Elteto, N., Nemeth, D., Janacsek, K., & Dayan, P. (2022). Tracking human skill learning with a hierarchical Bayesian sequence model. Poster presented at Computational and Systems Neuroscience Meeting (COSYNE 2022), Lisboa, Portugal.



Creators:
Elteto, N.¹, Author
Nemeth, D., Author
Janacsek, K., Author
Dayan, P.¹, Author
Affiliations:
¹ Department of Computational Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society

Content

Free keywords: -
 Abstract: Perceptuo-motor sequences that underlie our everyday skills from walking to language have higher-order dependencies such that the statistics of one sequence element depend on a variably deep window of past elements. We used a non-parametric, hierarchical, forgetful, Bayesian sequence model to characterize the multi-day evolution of human participants’ implicit representation of a serial reaction time task sequence with higher-order dependencies. The model updates trial-by-trial, and seamlessly combines predictive information from shorter and longer windows onto past events, weighting the windows proportionally to their predictive power. We fitted the model to participants’ response times (RTs), assuming that faster responses reflected more certain predictions of the upcoming elements. Already in the first session, the model fit showed that participants had begun to rely on two previous elements (i.e., trigrams) for prediction, thereby successfully adapting to the higher-order task structure. However, at this early stage, local histories influenced their responses, correctly captured by forgetting in the model. With training, forgetting of trigrams was reduced, so that RTs were more robust to local statistical fluctuations – evidence of skilled performance. However, error responses still reflected forgetting-induced volatility of the internal model. By the last training session, a subset of participants shifted their prior further to consider a context even deeper than just two previous elements. Our model was able to predict the degree to which individuals enriched their internal model to represent dependencies of increasing orders.
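The abstract describes a model that blends next-element predictions from context windows of different depths, weights each window by its predictive power, and forgets old evidence trial by trial. A minimal Python sketch of that general idea might look as follows; note this is an illustration, not the authors' actual non-parametric hierarchical Bayesian model, and the class name, the exponential `decay` factor, and the accuracy-tracking rule for each depth's weight are all assumptions made for the example:

```python
from collections import defaultdict

class ForgetfulNGramPredictor:
    """Illustrative sketch: mixes predictions from context windows of
    depth 0..max_depth, weighting each depth by its running predictive
    accuracy, and exponentially decays counts to implement forgetting."""

    def __init__(self, alphabet, max_depth=3, decay=0.99, lr=0.1):
        self.alphabet = list(alphabet)
        self.max_depth = max_depth
        self.decay = decay  # per-trial forgetting factor (assumed form)
        self.lr = lr        # tracking rate for each depth's predictive power
        # counts[d][context][symbol] -> exponentially decayed count
        self.counts = [defaultdict(lambda: defaultdict(float))
                       for _ in range(max_depth + 1)]
        # running estimate of each depth's predictive power, starts at chance
        self.power = [1.0 / len(self.alphabet)] * (max_depth + 1)

    def _dist(self, depth, history):
        """Add-one-smoothed next-symbol distribution for one window depth."""
        ctx = tuple(history[-depth:]) if depth else ()
        table = self.counts[depth][ctx]
        total = sum(table.values()) + len(self.alphabet)
        return {s: (table.get(s, 0.0) + 1.0) / total for s in self.alphabet}

    def predict(self, history):
        """Mixture over depths; weights proportional to predictive power."""
        w = sum(self.power)
        mix = {s: 0.0 for s in self.alphabet}
        for d in range(self.max_depth + 1):
            dist = self._dist(d, history)
            for s in self.alphabet:
                mix[s] += (self.power[d] / w) * dist[s]
        return mix

    def update(self, history, symbol):
        """Trial-by-trial update after observing `symbol` given `history`."""
        for d in range(self.max_depth + 1):
            # score this depth by the probability it assigned to the outcome
            p = self._dist(d, history)[symbol]
            self.power[d] += self.lr * (p - self.power[d])
            ctx = tuple(history[-d:]) if d else ()
            # forgetting: decay old evidence before adding the new count
            for s in self.counts[d][ctx]:
                self.counts[d][ctx][s] *= self.decay
            self.counts[d][ctx][symbol] += 1.0
```

Trained on a sequence with trigram structure (e.g. a repeating A B A C pattern, where the element after A is only predictable from the two previous elements), the depth-2 weight comes to dominate the mixture, loosely mirroring the abstract's account of participants learning to rely on trigrams.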

Details

Language(s):
 Dates: 2022-03
 Publication Status: Published online
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: -
 Degree: -

Event

Title: Computational and Systems Neuroscience Meeting (COSYNE 2022)
Place of Event: Lisboa, Portugal
Start-/End Date: 2022-03-17 - 2022-03-20


Source 1

Title: Computational and Systems Neuroscience Meeting (COSYNE 2022)
Source Genre: Proceedings
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: 2-098
Start / End Page: 166
Identifier: -