Released

Poster

Automatic integration of visual, tactile and auditory signals for the perception of sequences of events

MPS-Authors

Bresciani,  J-P
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Dammeier,  F
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ernst,  MO
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Bresciani, J.-P., Dammeier, F., & Ernst, M. (2006). Automatic integration of visual, tactile and auditory signals for the perception of sequences of events. Poster presented at 29th European Conference on Visual Perception (ECVP 2006), St. Petersburg, Russia.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D0A1-B
Abstract
Sequences of visual flashes, tactile taps, and auditory beeps were presented simultaneously. For each session, subjects were instructed to count the number of events presented in one modality (focal modality) and to ignore the other modalities (background). The number of events presented in the background modality(ies) could differ from the number of events in the focal modality. The experiment consisted of nine sessions, covering all nine combinations of visual, tactile, and auditory signals. In each session, the perceived number of events in the focal modality was significantly influenced by the background signal(s). The visual modality, which had the largest intrinsic variance (focal modality presented alone), was the most susceptible to background-evoked bias and the least efficient in biasing the other two modalities. Conversely, the auditory modality, which had the smallest intrinsic variance, was the least susceptible to background-evoked bias and the most efficient in biasing the other two modalities. These results show that visual, tactile, and auditory sensory signals tend to be automatically integrated for the perception of sequences of events. They also suggest that the relative weight of each sensory signal in the integration process depends on its intrinsic relative reliability.
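The reliability-based weighting suggested by these results is commonly modelled as inverse-variance (maximum-likelihood) cue combination. The following is a minimal Python sketch of that model, not the analysis used in the poster; the variance values, event-count estimates, and the function name integrate are illustrative assumptions.

import numpy as np

def integrate(estimates, variances):
    # Weight each modality's event-count estimate by its relative
    # reliability (inverse variance) and combine linearly.
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    reliabilities = 1.0 / variances
    weights = reliabilities / reliabilities.sum()
    combined = float(np.sum(weights * estimates))
    combined_variance = 1.0 / reliabilities.sum()
    return combined, weights, combined_variance

# Illustrative values only: audition most reliable (smallest variance),
# vision least reliable, matching the ordering reported in the abstract.
estimates = [4.0, 5.0, 5.0]   # visual, tactile, auditory counts
variances = [1.0, 0.5, 0.2]   # assumed intrinsic variances
combined, weights, var = integrate(estimates, variances)
print("weights (V, T, A):", weights.round(2))
print("combined estimate: %.2f (variance %.2f)" % (combined, var))

Under these assumptions the auditory signal receives the largest weight, so it biases the other modalities most and is itself biased least, which is the qualitative pattern described above.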