
Released

Poster

Functional neuroimaging of sound motion in the macaque dorsal stream

MPS-Authors
Ortiz, M
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Steudel, T
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Augath, MA
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Ortiz, M., Steudel, T., Augath, M., Logothetis, N., & Rauschecker, J. (2012). Functional neuroimaging of sound motion in the macaque dorsal stream. Poster presented at 4th International Conference on Auditory Cortex (ICAC 2012), Lausanne, Switzerland.


Cite as: http://hdl.handle.net/21.11116/0000-0001-9C24-7
Abstract
The macaque ventral intraparietal area (VIP), located in the fundus of the intraparietal sulcus (IPS), is considered a polymodal association area that responds to visual, tactile, vestibular and auditory stimuli (Schlack et al., 2005). In particular, VIP neurons are responsive to moving visual and auditory stimuli. VIP receives projections from multiple visual areas (especially the middle temporal area [MT] and the medial superior temporal complex [MST]) and from auditory regions in the posterior superior temporal (pST) cortex (Lewis & Van Essen, 2000). Neurons in pST, in particular the caudolateral area (CL), show selective responses to particular sound locations regardless of sound type (Tian et al., 2001; Recanzone, 2001). In humans, several studies have reported activation of the pST and IPS by sound-source motion (Warren et al., 2000; Krumbholz et al., 2005), confirming the existence of a dorsal processing stream for spatial aspects of sound in humans.

To bridge the gap between single-unit recordings in monkeys and neuroimaging studies in humans, we used high-resolution fMRI in monkeys to investigate these findings further. First, we created a virtual auditory space environment using binaural sound-recording techniques with miniature microphones inserted into a macaque head cast. We validated the technique by measuring saccadic eye movements to sound sources at different locations during playback. We then performed fMRI to identify cortical areas sensitive to sound motion in azimuth within the left and right hemifields. All fMRI data were pre-processed and aligned with the 112RM-SL_T1 rhesus monkey template for identification of cortical fields (McLaren et al., 2009).

Preliminary results showed that all moving sounds activated areas MT, MST and the IPS. Contrasting the left and right sound-motion conditions against the center condition (i.e., no motion) yielded greater activation in contralateral VIP. These results suggest that the interaural information induced by lateralized sounds is processed along a dorsal cortical processing stream comprising VIP in the respective contralateral hemisphere.
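The abstract notes that the lateralized stimuli carry interaural cues (time and level differences between the two ears). As a purely illustrative sketch (the study itself used binaural recordings through microphones in a head cast, not synthesis), azimuthal motion of the kind contrasted here can be approximated in software by imposing azimuth-dependent interaural time and level differences on a mono signal; all constants below (head radius, ILD scaling, segment count) are assumed values, not parameters from the study:

```python
import numpy as np

FS = 48_000            # sample rate in Hz (assumed)
HEAD_RADIUS = 0.09     # approximate head radius in metres (assumed)
SPEED_OF_SOUND = 343.0 # m/s

def itd_seconds(azimuth_deg):
    """Woodworth approximation of the interaural time difference."""
    az = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (az + np.sin(az))

def render_static(signal, azimuth_deg):
    """Render a mono signal at a fixed azimuth with an ITD and a crude ILD."""
    delay = int(round(abs(itd_seconds(azimuth_deg)) * FS))
    # Crude level difference: attenuate the far ear by up to ~10 dB at +/-90 deg
    ild_gain = 10 ** (-(abs(azimuth_deg) / 90 * 10) / 20)
    near = signal
    far = np.concatenate([np.zeros(delay), signal])[: len(signal)] * ild_gain
    # Positive azimuth = source on the right, so the right ear is "near"
    left, right = (far, near) if azimuth_deg >= 0 else (near, far)
    return np.stack([left, right], axis=1)

def render_motion(signal, az_start, az_end, n_segments=50):
    """Approximate smooth azimuthal motion as a sequence of static segments."""
    segments = np.array_split(signal, n_segments)
    azimuths = np.linspace(az_start, az_end, n_segments)
    return np.concatenate([render_static(s, a) for s, a in zip(segments, azimuths)])

noise = np.random.default_rng(0).standard_normal(FS)  # 1 s noise burst
stereo = render_motion(noise, -90, 90)                # left-to-right sweep
```

A center (no-motion) control corresponds to a fixed azimuth of 0, where the ITD and ILD vanish and both channels are identical; contrasting the sweep conditions against that control isolates the interaural motion cues.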