
Released

Conference Paper

Effect of attention on multimodal cue integration

MPG Authors
/persons/resource/persons83960

Helbig,  H
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83906

Ernst,  MO
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are stored for this record.
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)

EuroHaptics-2004-Helbig.pdf
(any full text), 879KB

Supplementary material (freely accessible)
No freely accessible supplementary material is available.
Citation

Helbig, H., & Ernst, M. (2004). Effect of attention on multimodal cue integration. In M. Buss, & M. Fritschi (Eds.), 4th International Conference EuroHaptics 2004 (pp. 524-527). München, Germany: Institute of Automatic Control Engineering.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-D8E1-8
Abstract
Humans gather information about their environment from multiple sensory channels. Cues from separate sensory modalities (e.g. vision and haptics) appear to be combined in a statistically optimal way according to a maximum-likelihood estimator [1]. Ernst and Banks showed that for bimodal perceptual estimates, the weight attributed to one sensory channel changes when its relative reliability is modified by increasing the noise associated with its signal. Because increasing the attentional load on a given sensory channel is likely to change its reliability, we assume that such a modification would also alter the weights of the different cues in multimodal perceptual estimates. Here we examine this hypothesis using a dual-task paradigm. The subjects' main task is to estimate the size of a raised bar using vision alone, haptics alone, or both modalities combined. Their performance on the main task alone is compared to their performance when an additional visual 'distractor' task is carried out simultaneously with the main task (dual-task paradigm). We found that vision-based estimates are more affected by a visual 'distractor' than haptics-based estimates. Our findings substantiate that attention influences the weighting of the different sensory channels in multimodal perceptual estimates: when attention is drawn away from the visual modality, the haptic estimates are weighted more heavily in visual-haptic size discrimination. In further experiments, we will examine the influence of a haptic 'distractor' task. We would expect a haptic 'distractor' to interfere to a greater extent with the haptic primary task, while the vision-based estimates in the main task should be less affected. We will then further examine whether cue integration remains statistically optimal.
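The maximum-likelihood cue-combination scheme the abstract refers to can be sketched in a few lines: each cue's weight is proportional to its reliability (inverse variance), so adding noise to one channel, e.g. through an attentional 'distractor', shifts weight toward the other. This is a minimal illustration of the general scheme, not the paper's model or data; all estimates and variances below are hypothetical.

```python
def ml_combine(est_vis, var_vis, est_hap, var_hap):
    """Maximum-likelihood combination of two independent Gaussian cues.

    Returns the combined estimate, the visual weight, and the variance
    of the combined estimate (hypothetical illustration only).
    """
    rel_vis = 1.0 / var_vis          # reliability = inverse variance
    rel_hap = 1.0 / var_hap
    w_vis = rel_vis / (rel_vis + rel_hap)
    combined = w_vis * est_vis + (1.0 - w_vis) * est_hap
    # The combined variance is lower than either single-cue variance.
    var_combined = 1.0 / (rel_vis + rel_hap)
    return combined, w_vis, var_combined

# Equally reliable cues receive equal weights (sizes in mm, made up):
c, w, v = ml_combine(55.0, 4.0, 53.0, 4.0)       # w = 0.5, c = 54.0

# Doubling the visual variance (e.g. via a visual 'distractor' task)
# reduces the visual weight, pulling the estimate toward haptics:
c2, w2, v2 = ml_combine(55.0, 8.0, 53.0, 4.0)    # w2 = 1/3
```

Under this scheme the paper's prediction falls out directly: detracting attention from vision lowers visual reliability, so the haptic weight `1 - w_vis` rises in the bimodal estimate.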