  Tri-modal integration of visual, tactile and auditory signals for the perception of sequences of events

Bresciani, J.-P., Dammeier, F., & Ernst, M. (2008). Tri-modal integration of visual, tactile and auditory signals for the perception of sequences of events. Brain Research Bulletin, 75(6), 753-760. doi:10.1016/j.brainresbull.2008.01.009.

Creators:
Bresciani, J.-P. (1, 2), Author
Dammeier, F. (1, 2, 3), Author
Ernst, M. O. (2, 3), Author

Affiliations:
1. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE
3. Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Content
Abstract: We investigated the interactions between visual, tactile and auditory sensory signals for the perception of sequences of events. Sequences of flashes, taps and beeps were presented simultaneously. For each session, subjects were instructed to count the number of events presented in one modality (Target) and to ignore the stimuli presented in the other modalities (Background). The number of events presented in the background sequence could differ from the number of events in the target sequence. For each session, we quantified the Background-evoked bias by comparing subjects' responses with and without Background (Target presented alone). Nine combinations of vision, touch and audition were tested.

In all but two sessions, the Background significantly biased the Target. Vision was the most susceptible to Background-evoked bias and the least efficient in biasing the other two modalities. By contrast, audition was the least susceptible to Background-evoked bias and the most efficient in biasing the other two modalities. These differences were strongly correlated with the relative reliability of each modality. In line with this, the evoked biases were larger when the Background consisted of two modalities instead of only one.

These results show that for the perception of sequences of events: (1) vision, touch and audition are automatically integrated; (2) the respective contributions of the three modalities to the integrated percept differ; (3) the relative contribution of each modality depends on its relative reliability (1/variability); (4) task-irrelevant stimuli have more weight when presented in two rather than only one modality.
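Point (3) above corresponds to the standard maximum-likelihood (reliability-weighted) account of cue integration, in which each modality contributes in proportion to the inverse of its variance. A minimal sketch of that scheme follows; the numbers are illustrative assumptions, not data or analysis code from the paper:

```python
# Sketch of reliability-weighted (maximum-likelihood) cue integration.
# Each modality's estimate is weighted by its reliability r_i = 1 / sigma_i^2;
# the integrated percept is the reliability-weighted mean of the estimates.

def integrate(estimates, sigmas):
    """Combine unimodal estimates with weights proportional to 1/variance."""
    reliabilities = [1.0 / s**2 for s in sigmas]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = sum(w * e for w, e in zip(weights, estimates))
    # The combined estimate is more reliable than any single cue:
    fused_sigma = (1.0 / total) ** 0.5
    return fused, weights, fused_sigma

# Hypothetical event-count estimates for (vision, touch, audition), with
# audition assumed most reliable (smallest sigma) and vision least, mirroring
# the ordering of biases reported above.
fused, weights, fused_sigma = integrate([4.0, 5.0, 5.0], [1.0, 0.7, 0.5])
```

Under these assumed sigmas, audition receives the largest weight and vision the smallest, so the fused count sits closer to the auditory estimate; the fused sigma is smaller than the best single cue's, which is why a two-modality Background would exert a stronger pull than either modality alone.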

Details

Dates: 2008-04
Publication Status: Issued


Source 1

Title: Brain Research Bulletin
Other: Brain Res. Bull.
Source Genre: Journal
Publ. Info: Phoenix, N.Y. : Elsevier
Volume / Issue: 75 (6)
Start / End Page: 753 - 760
Identifier: ISSN: 0361-9230
CoNE: https://pure.mpg.de/cone/journals/resource/954925522699