Record

  Audio-visual integration during multisensory object categorization

Werner, S., & Noppeney, U. (2006). Audio-visual integration during multisensory object categorization. Poster presented at 7th International Multisensory Research Forum (IMRF 2006), Dublin, Ireland. Retrieved from http://imrf.mcmaster.ca/IMRF/2006/viewabstract.php?id=124.

External references

Description: -
OA-Status: -
Creators

Creators:
Werner, S.1, 2, Author
Noppeney, U.1, 2, Author
Affiliations:
1 Research Group Cognitive Neuroimaging, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497804
2 Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Keywords: -
Abstract: Tools or musical instruments are characterized by their form and sound. We investigated audio-visual integration during semantic categorization by presenting pictures and sounds of objects separately or together and manipulating the degree of information content. The 3 x 6 factorial design manipulated (1) auditory information (sound, noise, silence) and (2) visual information (6 levels of image degradation). The visual information was degraded by manipulating the amount of phase scrambling of the image (0, 20, 40, 60, 80, 100%). Subjects categorized stimuli as musical instruments or tools. In terms of accuracy and reaction times (RT), we found significant main effects of (1) visual and (2) auditory information and (3) an interaction between the two factors. The interaction was primarily due to an increased facilitatory effect of sound at the 80% degradation level. Consistently across the first 5 levels of visual degradation, we observed RT improvements for the sound-visual relative to the noise-visual or silence-visual conditions. Corresponding RT distributions significantly violated the so-called race model inequality across the first 5 percentiles of their cumulative density functions (even when controlling for low-level audio-visual interactions). These results suggest that redundant structural and semantic information is not independently processed but integrated during semantic categorization.
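
The race model inequality referenced in the abstract states that, if the auditory and visual signals were processed in independent parallel channels, the cumulative RT distribution of the bimodal condition could never exceed the sum of the two unimodal distributions: P(RT <= t | AV) <= P(RT <= t | A) + P(RT <= t | V). The following Python sketch illustrates such a check on empirical distribution functions; the function names, the percentile grid, and the simulated reaction times are illustrative assumptions, not the analysis code used in the study.

import numpy as np

def empirical_cdf(rts, t_grid):
    # P(RT <= t) estimated from a sample of reaction times
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

def race_model_violation(rt_av, rt_a, rt_v, percentiles=np.arange(5, 55, 5)):
    # Race model inequality: under independent (race) processing,
    #   P(RT <= t | AV) <= P(RT <= t | A) + P(RT <= t | V)  for every t.
    # Positive values of the returned difference indicate violations,
    # i.e. evidence for audio-visual integration.
    t_grid = np.percentile(rt_av, percentiles)  # common time points at the fast percentiles
    cdf_av = empirical_cdf(rt_av, t_grid)
    bound = np.minimum(empirical_cdf(rt_a, t_grid) + empirical_cdf(rt_v, t_grid), 1.0)
    return cdf_av - bound

# Illustrative simulated reaction times in seconds (hypothetical values):
rng = np.random.default_rng(0)
rt_a = rng.normal(0.62, 0.08, 200)    # auditory-only condition
rt_v = rng.normal(0.60, 0.08, 200)    # visual-only condition
rt_av = rng.normal(0.52, 0.07, 200)   # audio-visual condition
print(race_model_violation(rt_av, rt_a, rt_v))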

Details

Language(s): -
Date: 2006-06
Publication status: Published online
Pages: -
Place, publisher, edition: -
Table of contents: -
Review type: -
Identifiers: URI: http://imrf.mcmaster.ca/IMRF/2006/viewabstract.php?id=124
BibTeX citekey: 4342
Degree: -

Event

Title: 7th International Multisensory Research Forum (IMRF 2006)
Event location: Dublin, Ireland
Start/End date: 2006-06-18 - 2006-06-21

Source 1

Title: 7th International Multisensory Research Forum (IMRF 2006)
Source genre: Conference proceedings
Creators: -
Affiliations: -
Place, publisher, edition: -
Pages: -
Volume / Issue: -
Article number: 124
Start / End page: -
Identifier: -