
Released

Meeting Abstract

Multisensory integration in early auditory areas

MPG Authors

Kayser, C.
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (publicly accessible)
No publicly accessible full texts are available in PuRe
Supplementary material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Kayser, C. (2007). Multisensory integration in early auditory areas. Neural Plasticity, 2007: 23250, 29.


Citation link: https://hdl.handle.net/21.11116/0000-0002-D8F9-2
Abstract
An increasing body of literature from functional imaging, electrophysiology, and anatomy provides compelling evidence that the merging of sensory information occurs not only in higher association areas but also in lower sensory regions. To investigate early cross-modal interactions in detail, we use the macaque auditory cortex as a model and employ a combination of high-resolution functional imaging (fMRI) and electrophysiological recordings. In the imaging data, few voxels respond to non-auditory stimulation alone, but many show cross-modal interactions in the form of supra-linear enhancement; i.e., the multimodal response exceeds the linear superposition of the unisensory responses. This effect is reliably found at the caudal end and along the lateral side of the secondary auditory cortex, and can be localized to the medial and caudal belt and caudal parabelt regions. The interaction obeys the classical rules of sensory integration: it occurs only for temporally coincident stimuli and follows the principle of inverse effectiveness (integration is stronger for less effective stimuli). Complementary electrophysiological recordings demonstrate that the imaging results are closely paralleled by similar findings in the low-frequency local field potentials. Individual neurons, however, often show the opposite effect and exhibit a decreased response when a visual stimulus is presented simultaneously with a sound. This audio-visual depression occurs with a time lag of about 40-80 ms and for a wide range of simple and naturalistic stimuli. Altogether, our results support the notion that early sensory cortices are susceptible to modulation by other senses. For individual neurons, however, these effects are subtle and are better detected at the level of population responses. Future studies need to resolve where exactly this cross-modal input originates and how it helps the auditory system segregate our complex acoustic environment.