Abstract:
Humans gather information about their environment from multiple sensory channels. Cues from separate sensory modalities (e.g., vision and haptics) appear to be combined in a statistically optimal way according to a maximum-likelihood estimator (Ernst & Banks, 2002). Ernst and Banks showed that for bimodal perceptual estimates, the weight attributed to one sensory channel changes when its relative reliability is reduced by increasing the noise associated with its signal.
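The maximum-likelihood rule referenced here weights each cue by its relative reliability (inverse variance). A minimal numerical sketch of this combination rule follows; the function and parameter names are illustrative, not taken from the paper:

```python
import numpy as np

def mle_combine(mu_v, sigma_v, mu_h, sigma_h):
    """Combine a visual and a haptic size estimate by inverse-variance
    (reliability) weighting, the maximum-likelihood rule described by
    Ernst & Banks (2002)."""
    w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2)  # visual weight rises as haptic noise rises
    w_h = 1.0 - w_v                               # haptic weight
    mu = w_v * mu_v + w_h * mu_h                  # combined size estimate
    # Variance of the combined estimate is lower than either single cue's
    sigma = np.sqrt((sigma_v**2 * sigma_h**2) / (sigma_v**2 + sigma_h**2))
    return mu, sigma, w_v

# Example: two equally reliable cues receive equal weight
mu, sigma, w_v = mle_combine(mu_v=10.0, sigma_v=1.0, mu_h=12.0, sigma_h=1.0)
```

Increasing `sigma_v` (e.g., by adding visual noise) shifts the weight toward the haptic estimate, which is the reliability-dependent reweighting the study builds on.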
Here we address the question of whether selectively increasing the attentional load on one sensory channel affects the weighting of cues from different sensory channels.
In our experiment, the subjects' main task was to estimate the size of a raised bar using vision alone, haptics alone, or both modalities combined. Their performance on the main task alone was compared to their performance when a concurrent visual distractor task was performed. We found that vision-based estimates were more affected by the visual distractor than haptics-based estimates; thus, attention was indeed selectively drawn away from the visual modality. Moreover, we found that the cue weighting was not affected by adding the visual distractor task.
We therefore conclude that multisensory integration occurs at an early stage of processing and is not affected by attention.