  Natural, metaphoric and linguistic auditory-visual interactions

Sadaghiani, S., Maier, J., & Noppeney, U. (2008). Natural, metaphoric and linguistic auditory-visual interactions. Poster presented at 9th International Multisensory Research Forum (IMRF 2008), Hamburg, Germany.

Creators

Creators:
Sadaghiani, S. (1, 2, 3), Author
Maier, J. (2, 3), Author
Noppeney, U. (1, 3), Author

Affiliations:
1. Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497804
2. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
3. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
 Abstract: To form a coherent percept of our dynamic environment, the brain merges motion information from the auditory and visual senses. Yet not only auditory motion, but also 'metaphoric' pitch has been shown to influence visual motion discrimination. Here, we systematically investigate the neural systems that mediate auditory influences on visual motion discrimination in natural, metaphoric and linguistic contexts. In a visual selective attention paradigm, subjects discriminated the direction of visual motion at several levels of ambiguity while ignoring a simultaneous auditory stimulus that was 1) 'natural' MOTION: left- vs. right-moving white noise, 2) 'metaphoric' PITCH: rising vs. falling pitch, or 3) 'linguistic' SPEECH: spoken German words denoting directions, e.g. 'links' ('left') vs. 'rechts' ('right'). Behaviourally, all three classes of auditory stimuli induced a comparable directional bias. At the neural level, the interaction between visual ambiguity and audition revealed an auditory influence on visual motion processing for MOTION in left hMT+/V5 and for SPEECH in right intraparietal sulcus (IPS). Direct comparisons across contexts confirmed this functional dissociation: the interaction effect gradually decreased in left hMT+/V5 for MOTION > PITCH > SPEECH and in right IPS for SPEECH > PITCH > MOTION. In conclusion, while natural audio-visual integration of motion signals emerges in motion-processing areas, linguistic interactions are revealed primarily in higher-level fronto-parietal regions.

Details

Language(s):
 Dates: 2008-07
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: BibTex Citekey: 5479
 Degree: -

Event

Title: 9th International Multisensory Research Forum (IMRF 2008)
Place of Event: Hamburg, Germany
Start-/End Date: 2008-07-16 - 2008-07-19

Source 1

Title: 9th International Multisensory Research Forum (IMRF 2008)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: 57
Start / End Page: 132
Identifier: -