  Eye movement patterns when playing from memory: Examining consistency across repeated performances and the relationship between eyes and audio

Fink, L. (2023). Eye movement patterns when playing from memory: Examining consistency across repeated performances and the relationship between eyes and audio. In M. Tsuzaki, M. Sadakata, S. Ikegami, T. Matsui, M. Okano, & H. Shoda (Eds.), The e-proceedings of the 17th International Conference on Music Perception and Cognition and the 7th Conference of the Asia-Pacific Society for the Cognitive Sciences of Music.

Basic
Genre: Conference Paper


Creators

Fink, Lauren (1, 2), Author
Affiliations:
1. Department of Music, Max Planck Institute for Empirical Aesthetics, Max Planck Society
2. Department of Psychology, Neuroscience & Behaviour, McMaster University, Hamilton, Ontario, Canada

Content

Free keywords: memory, piano, eye-tracking, dynamic time warping, matrix profile, motif, audio indexing
Abstract: While the eyes serve an obvious function in the context of music reading, their role during memorized music performance (i.e., when there is no score) is currently unknown. Given previous work showing relationships between eye movements and body movements, and between eye movements and memory retrieval, here I ask 1) whether eye movements become a stable aspect of the memorized music (motor) performance, and 2) whether the structure of the music is reflected in eye movement patterns. In this case study, three pianists chose two pieces to play from memory. They came into the lab on four different days, separated by at least 12 hours, and played each of their two pieces three times. To answer 1), I compared dynamic time warping cost within vs. between pieces, and found significantly lower warping costs within piece, for both horizontal and vertical eye movement time series, providing a first proof of concept that eye movement patterns are conserved across repeated memorized music performances. To answer 2), I used the Matrix Profiles of the eye movement time series to automatically detect motifs (repeated patterns). By then analyzing participants' recorded audio at moments of detected ocular motifs, repeated sections of music could be identified (confirmed auditorily and by inspection of the extracted pitch and amplitude envelopes of the indexed audio snippets). Overall, the current methods provide a promising approach for future studies of music performance, enabling exploration of the relationship between body movements, eye movements, and musical processing.
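The two analyses named in the abstract can be sketched in a few lines of Python. This is a minimal illustration only: the toy "eye position" series, the window length, and the plain Euclidean window distance are assumptions for demonstration, not the study's actual data or parameters.

```python
import math

def dtw_cost(a, b):
    """Dynamic-time-warping alignment cost between two 1-D time series."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            D[i][j] = d + min(D[i - 1][j],       # insertion
                              D[i][j - 1],       # deletion
                              D[i - 1][j - 1])   # match
    return D[n][m]

def matrix_profile(x, w):
    """Naive matrix profile: for each length-w window, the distance to its
    nearest non-overlapping neighbour, plus that neighbour's index."""
    n = len(x) - w + 1
    prof, idx = [math.inf] * n, [-1] * n
    for i in range(n):
        for j in range(n):
            if abs(i - j) < w:   # exclusion zone: skip trivial self-matches
                continue
            d = math.dist(x[i:i + w], x[j:j + w])
            if d < prof[i]:
                prof[i], idx[i] = d, j
    return prof, idx

# Toy horizontal eye-position traces: two takes of "piece A", one of "piece B".
take1 = [0, 1, 2, 3, 2, 1, 0]
take2 = [0, 1, 2, 2, 3, 2, 1, 0]   # same piece, slight timing variation
other = [3, 3, 0, 0, 3, 3, 0]      # a different piece

# Question 1: within-piece warping cost is lower than between-piece cost.
assert dtw_cost(take1, take2) < dtw_cost(take1, other)

# Question 2: a repeated gaze pattern (motif) appears as a near-zero profile value.
series = [0, 5, 0, 5, 0, 1, 1, 1, 0, 5, 0, 5, 0, 2, 2]
prof, idx = matrix_profile(series, w=5)
motif = min(range(len(prof)), key=prof.__getitem__)
print(motif, idx[motif])   # motif pair: identical windows start at 0 and 8
```

In practice the matrix profile is computed with optimized algorithms (e.g. as implemented in the STUMPY library) rather than this O(n²·w) loop; the sketch is only meant to make the two ideas concrete.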

Details

Language(s): eng - English
 Dates: 2023-08-23
 Publication Status: Published online

Event

Title: 17th ICMPC
Place of Event: Nihon University College of Art – Ekoda Campus, Tokyo, Japan; hybrid participation
Start-/End Date: 2023-08-24 - 2023-08-28


Source 1

Title: The e-proceedings of the 17th International Conference on Music Perception and Cognition and the 7th Conference of the Asia-Pacific Society for the Cognitive Sciences of Music
Source Genre: Proceedings
 Creator(s):
Tsuzaki, Minoru, Editor
Sadakata, Makiko, Editor
Ikegami, Shimpei, Editor
Matsui, Toshie, Editor
Okano, Masahiro, Editor
Shoda, Haruka, Editor