Free keywords:
-
Abstract:
We investigated the interactions between visual, tactile and auditory sensory signals for the perception of sequences of events. Sequences of
flashes, taps and beeps were presented simultaneously. For each session, subjects were instructed to count the number of events presented in one
modality (Target) and to ignore the stimuli presented in the other modalities (Background). The number of events presented in the background
sequence could differ from the number of events in the target sequence. For each session, we quantified the Background-evoked bias by comparing
subjects' responses with and without the Background (Target presented alone). Nine combinations of vision, touch and audition were tested.
In all sessions but two, the Background significantly biased the Target. Vision was the most susceptible to Background-evoked bias and the
least efficient in biasing the other two modalities. By contrast, audition was the least susceptible to Background-evoked bias and the most efficient
in biasing the other two modalities. These differences were strongly correlated with the relative reliability of each modality. In line with this, the
evoked biases were larger when the Background consisted of two modalities instead of one.
These results show that for the perception of sequences of events: (1) vision, touch and audition are automatically integrated; (2) the respective
contributions of the three modalities to the integrated percept differ; (3) the relative contribution of each modality depends on its relative reliability
(1/variability); (4) task-irrelevant stimuli have more weight when presented in two rather than only one modality.
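Point (3) can be illustrated with the standard reliability-weighted (inverse-variance) cue-combination scheme. This is a minimal sketch, assuming the conventional model in which each modality's weight is proportional to its reliability; the function name and example numbers are hypothetical and not taken from the study.

```python
def integrate(estimates, variances):
    """Combine unimodal event-count estimates by inverse-variance weighting.

    Each cue's weight is its reliability (1/variance), normalized so the
    weights sum to 1; the fused estimate is the weighted mean.
    """
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    return sum(w * e for w, e in zip(weights, estimates))

# Hypothetical example: audition has the smallest variance, so it dominates
# the fused percept; vision has the largest variance, so it contributes least.
fused = integrate([4.0, 5.0, 6.0], [0.5, 1.0, 2.0])  # audition, touch, vision
```

Under this scheme, a less reliable modality (vision) is both more easily biased by the others and less able to bias them, matching the pattern of results above.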