Free keywords:
-
Abstract:
Sequences of visual flashes, tactile taps, and auditory beeps were presented simultaneously. For each session, subjects were instructed to count the number of events presented in one modality (focal modality) and to ignore the other modalities (background). The number of events presented in the background modality(ies) could differ from the number of events in the focal modality. The experiment consisted of nine different sessions, all nine combinations between visual, tactile, and auditory signals being tested. In each session, the perceived number of events in the focal modality was significantly influenced by the background signal(s). The visual modality, which had the largest intrinsic variance (focal modality presented alone), was the most susceptible to background-evoked bias and the less efficient in biasing the other two modalities. Conversely, the auditory modality, which had the smallest intrinsic variance, was the less susceptible to background-evoked bias and the most efficient in biasing the othe
r two modalities. These results show that visual, tactile, and auditory sensory signals tend to be automatically integrated for the perception of sequences of events. They also suggest that the relative weight of each sensory signal in the integration process depends on its intrinsic relative reliability.
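The reliability-weighting principle described above is often formalized as maximum-likelihood cue combination, in which each modality's estimate is weighted in inverse proportion to its variance. The sketch below illustrates that standard textbook scheme, not the authors' specific model; the numeric counts and variances are hypothetical.

```python
def fuse(estimates, variances):
    """Combine unimodal estimates, weighting each by its inverse variance.

    A cue with low variance (e.g. audition) gets a large weight; a cue
    with high variance (e.g. vision) gets a small weight.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    weights = [w / total for w in weights]           # normalize to sum to 1
    fused = sum(w * e for w, e in zip(weights, estimates))
    fused_var = 1.0 / sum(1.0 / v for v in variances)  # combined variance
    return fused, fused_var, weights

# Hypothetical event counts: vision reports 7 (variance 4),
# audition reports 5 (variance 1).
est, var, w = fuse([7.0, 5.0], [4.0, 1.0])
# The fused estimate (5.4) sits much closer to the reliable auditory
# count, mirroring the bias pattern reported in the abstract.
```

Note that the combined variance (0.8 here) is smaller than either unimodal variance, which is why integration is advantageous when the cues refer to the same events.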