
Item Details


Released

Talk

Naturalistic, metaphoric and linguistic auditory-visual interactions

MPS-Authors
/persons/resource/persons84182

Sadaghiani,  S
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84112

Noppeney,  U
Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Sadaghiani, S., & Noppeney, U. (2007). Naturalistic, metaphoric and linguistic auditory-visual interactions. Talk presented at 37th Annual Meeting of the Society for Neuroscience (Neuroscience 2007). San Diego, CA, USA.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-CB65-9
Abstract
To form a coherent percept of our dynamic environment, the brain merges motion information from the auditory and visual senses. Yet not only the ‘naturalistic’ direction information of auditory motion, but also the ‘metaphoric’ direction information of dynamic pitch, has been shown to influence visual motion discrimination. Here, we systematically investigate the neural systems that mediate auditory influences on visual motion discrimination in naturalistic, metaphoric and linguistic contexts. In a visual selective attention paradigm, subjects discriminated the direction of visual motion at several levels of ambiguity while ignoring a simultaneous auditory stimulus that could be congruent, incongruent or absent. Audio-visual congruency was defined at the (1) naturalistic, (2) metaphoric and (3) linguistic levels using three classes of auditory stimuli: (1) MOTION: leftward vs. rightward moving white noise, (2) PITCH: rising vs. falling pitch and (3) SPEECH: spoken German words denoting direction, e.g. ‘links’ vs. ‘rechts’. At the behavioral level, all three classes of auditory stimuli induced a directional bias, and this bias did not differ significantly across contexts. At the neural level, the auditory influence on visual motion processing was identified through (1) the interaction between visual ambiguity and audition (sound present vs. absent) and (2) the incongruency effect, separately for MOTION, PITCH and SPEECH. A significant interaction was revealed for MOTION in left hMT+/V5 and for SPEECH in the right intraparietal sulcus. An incongruency effect was observed only for SPEECH, in the left superior temporal gyrus and right middle frontal gyrus. Direct comparisons across contexts confirmed this functional dissociation: the interaction effect decreased gradually in left hMT+/V5 for MOTION > PITCH > SPEECH and in right IPS for SPEECH > PITCH > MOTION. Our results suggest that audition can influence visual motion discrimination at the naturalistic, metaphoric and linguistic levels. Yet, even though the auditory bias was comparable across contexts, our functional imaging results suggest that these influences are mediated by different neural systems: naturalistic influences emerge in motion-processing areas, whereas linguistic interactions are revealed primarily in higher-level fronto-parietal regions.
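
To make the analysis logic concrete, the following is a minimal sketch, not the authors' code, of how the two reported effects could be expressed as contrasts over a factorial condition design. The condition names, weight ordering and toy numbers are illustrative assumptions; only the contrast logic (ambiguity x sound interaction, incongruent vs. congruent) follows the abstract.

    import numpy as np

    # Hypothetical cell order for one auditory class (e.g. MOTION):
    # visual ambiguity (low/high) crossed with sound (absent/present).
    cells = ["low_absent", "low_present", "high_absent", "high_present"]

    # (1) Interaction contrast: does adding sound matter more when the
    # visual stimulus is ambiguous?
    # (high_present - high_absent) - (low_present - low_absent)
    interaction = np.array([+1, -1, -1, +1])

    # Toy condition estimates (arbitrary units) just to show the arithmetic:
    betas = np.array([1.0, 1.2, 1.0, 2.0])
    print("interaction effect:", interaction @ betas)   # +0.8: sound matters more under ambiguity

    # (2) Incongruency contrast over congruent vs. incongruent sound trials:
    incongruency = np.array([-1, +1])           # order: congruent, incongruent
    betas_cong = np.array([1.4, 2.1])           # toy estimates
    print("incongruency effect:", incongruency @ betas_cong)  # +0.7: larger response when incongruent

In an fMRI analysis such weights would be applied to the corresponding regressor estimates in each voxel; here they simply illustrate why the two tests isolate different aspects of the auditory influence.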