
Item Details


Released

Poster

Multisensory Interactions in Auditory Cortex

MPS-Authors

Kayser,  C
Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Petkov,  C
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Logothetis,  NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resource
There are no locators available
Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Kayser, C., Petkov, C., & Logothetis, N. (2007). Multisensory Interactions in Auditory Cortex. Poster presented at 10th Tübinger Wahrnehmungskonferenz (TWK 2007), Tübingen, Germany.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CCF3-1
Abstract
An increasing body of literature provides compelling evidence that sensory convergence occurs not only in higher association areas but also in lower sensory regions and even in primary sensory cortices. To scrutinize these early cross-modal interactions, we use the macaque auditory cortex as a model and employ combinations of high-resolution functional imaging (fMRI) and electrophysiological recordings. Using functional imaging in alert and anaesthetized animals, we reported that (only) caudal auditory fields are susceptible to cross-modal modulation: the fMRI-BOLD response in these regions was enhanced when auditory stimuli were complemented by simultaneous visual or touch stimulation [1,2]. To investigate the neuronal basis of this cross-modal enhancement, we recorded local field potentials and single units in alert animals watching complex audio-visual scenes. Our results show the following: visual stimuli by themselves, on average, do not drive auditory neurons, but they do evoke responses in low-frequency LFPs. Combining visual and auditory stimuli leads to enhanced responses in the low-frequency LFP, but to a reduction of firing rates. This audio-visual interaction was significant at the population level, and for about 10 of the neurons when tested individually. The interaction occurs only for well-timed visual stimuli and is strongest when the visual stimulus leads the auditory stimulus by 20–80 msec, but it is independent of the image structure of the visual stimulus. Similar visual modulation was found in the auditory core and belt. Our findings point to a very basic, stimulus-unspecific visual input to auditory cortex and clearly support the notion that early sensory cortices are susceptible to cross-modal interactions. In particular, the finding that visual stimuli modulate the firing rates of individual neurons in auditory cortex suggests that the messages transmitted from these regions to higher processing stages reflect not only acoustic stimuli but also depend on their visual context.
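
As a rough illustration of the kind of comparison described in the abstract, the sketch below (Python with NumPy/SciPy) computes a simple audio-visual modulation index per neuron and compares audio-visual (AV) against auditory-alone (A) responses both across the population and neuron by neuron. All data are simulated and all variable names are hypothetical; this is a minimal sketch of the general approach, not the authors' analysis code, stimuli, or statistics.

# Minimal sketch, simulated data only: compare AV vs A firing rates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_neurons, n_trials = 100, 40

# Simulated trial-by-trial firing rates (spikes/s) for each neuron.
base = rng.uniform(5.0, 20.0, size=(n_neurons, 1))
rates_a = rng.normal(loc=base, scale=2.0, size=(n_neurons, n_trials))          # auditory alone
rates_av = rng.normal(loc=base * 0.9, scale=2.0, size=(n_neurons, n_trials))   # AV: mild suppression

# Per-neuron modulation index on trial-averaged rates;
# negative values mean the added visual stimulus reduced firing.
mean_a, mean_av = rates_a.mean(axis=1), rates_av.mean(axis=1)
mod_index = (mean_av - mean_a) / (mean_av + mean_a)

# Population-level test: paired comparison of mean AV vs mean A across neurons.
_, p_pop = stats.wilcoxon(mean_av, mean_a)

# Per-neuron tests: fraction of individually significant neurons (uncorrected).
p_per_neuron = np.array([stats.mannwhitneyu(rates_av[i], rates_a[i]).pvalue
                         for i in range(n_neurons)])
frac_sig = np.mean(p_per_neuron < 0.05)

print(f"mean modulation index: {mod_index.mean():+.3f}")
print(f"population-level p-value: {p_pop:.3g}")
print(f"fraction of individually significant neurons: {frac_sig:.2f}")

The same two-level logic (a paired test across the recorded population plus per-neuron tests on trial-level responses) is what distinguishes the abstract's population-level significance from the smaller subset of neurons that reach significance individually.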