Journal Article

Coding of Melodic Gestalt in Human Auditory Cortex


Herdener, M.
Department High-Field Magnetic Resonance, Max Planck Institute for Biological Cybernetics, Max Planck Society


Schindler, A., Herdener, M., & Bartels, A. (2013). Coding of Melodic Gestalt in Human Auditory Cortex. Cerebral Cortex, 23(12), 2987-2993. doi:10.1093/cercor/bhs289.

Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B616-8
The perception of a melody is invariant to the absolute properties of its constituent notes, but depends on the relation between them—the melody's relative pitch profile. In fact, a melody's "Gestalt" is recognized regardless of the instrument or key used to play it. Pitch processing in general is assumed to occur at the level of the auditory cortex. However, it is unknown whether early auditory regions are able to encode pitch sequences integrated over time (i.e., melodies) and whether the resulting representations are invariant to specific keys. Here, we presented participants with different melodies composed of the same 4 harmonic pitches during functional magnetic resonance imaging recordings. Additionally, we played the same melodies transposed in different keys and on different instruments. We found that melodies were invariantly represented by their blood oxygen level–dependent activation patterns in primary and secondary auditory cortices across instruments, and also across keys. Our findings extend common hierarchical models of auditory processing by showing that melodies are encoded independently of absolute pitch, based on their relative pitch profile, as early as the primary auditory cortex.