
Released

Journal Article

Visual-haptic cue weighting is independent of modality-specific attention

MPG Authors
/persons/resource/persons83960

Helbig,  HB
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83906

Ernst,  MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources have been deposited
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe
Supplementary material (freely accessible)
No freely accessible supplementary materials are available
Citation

Helbig, H., & Ernst, M. (2008). Visual-haptic cue weighting is independent of modality-specific attention. Journal of Vision, 8(1):21, 1-16. doi:10.1167/8.1.21.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-CAB3-0
Abstract
Some object properties (e.g., size, shape, and depth information) are perceived through multiple sensory modalities. Such redundant sensory information is integrated into a unified percept. The integrated estimate is a weighted average of the sensory estimates, where higher weight is attributed to the more reliable sensory signal.

Here we examine whether modality-specific attention can affect multisensory integration. Selectively reducing attention in one sensory channel can reduce the relative reliability of the estimate derived from this channel and might thus alter the weighting of the sensory estimates. In the present study, observers performed unimodal (visual and haptic) and bimodal (visual-haptic) size discrimination tasks. They either performed the primary task alone or they performed a secondary task simultaneously (dual task). The secondary task consisted of a same/different judgment of rapidly presented visual letter sequences, and so might be expected to withdraw attention predominantly from the visual rather than the haptic channel.

Comparing size discrimination performance in single- and dual-task conditions, we found that vision-based estimates were more affected by the secondary task than the haptics-based estimates, indicating that attention to vision was indeed reduced more than attention to haptics. This attentional manipulation, however, did not affect the cue weighting in the bimodal task. Bimodal discrimination performance was better than unimodal performance in both single- and dual-task conditions, indicating that observers still integrate visual and haptic size information in the dual-task condition, when attention is withdrawn from vision. These findings indicate that visual-haptic cue weighting is independent of modality-specific attention.
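For reference, the reliability-weighted averaging scheme described in the abstract is commonly formalized as maximum-likelihood cue integration. The following is a standard textbook sketch, not a formula quoted from this paper, and the symbols are illustrative:

\[
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad
w_H = 1 - w_V,
\]

where \(\hat{S}_V\) and \(\hat{S}_H\) are the unimodal visual and haptic size estimates and \(\sigma_V^2\), \(\sigma_H^2\) their variances. Under this model the variance of the combined estimate,

\[
\sigma_{VH}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2},
\]

is never larger than either unimodal variance, which is why bimodal discrimination performance exceeding unimodal performance is taken as a signature of integration in the study described above.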