  Vision and touch are automatically integrated for the perception of sequences of events

Bresciani, J.-P., Dammeier, F., & Ernst, M. (2006). Vision and touch are automatically integrated for the perception of sequences of events. Journal of Vision, 6(5), 554-564. doi:10.1167/6.5.2.


Basic

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0013-D235-2
Version Permalink: http://hdl.handle.net/21.11116/0000-0006-C4DB-6
Genre: Journal Article


Creators

Bresciani, J.-P. (1, 2), Author
Dammeier, F. (1, 2), Author
Ernst, M. O. (1, 2), Author

Affiliations:
1. Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
2. Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Free keywords: -
Abstract: The purpose of the present experiment was to investigate the integration of sequences of visual and tactile events. Participants were simultaneously presented with sequences of visual flashes and tactile taps and instructed to count either the flashes (session 1) or the taps (session 2). The number of flashes could differ from the number of taps by ±1. In both sessions, the perceived number of events was significantly influenced by the number of events presented in the task-irrelevant modality. Touch had a stronger influence on vision than vision had on touch. Interestingly, touch was the more reliable of the two modalities, yielding less variable estimates when presented alone. In both sessions, the perceptual estimates were less variable when stimuli were presented in both modalities than when the task-relevant modality was presented alone. These results indicate that even when one signal is explicitly task-irrelevant, sensory information tends to be automatically integrated across modalities. They also suggest that the relative weight of each sensory channel in the integration process depends on its relative reliability. The results are described using a Bayesian probabilistic model for multimodal integration that accounts for the coupling between the sensory estimates.
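The reliability-weighted integration the abstract describes can be sketched as standard Gaussian cue combination: each unimodal estimate is weighted by its inverse variance, so the more reliable channel (here, touch) dominates, and the fused estimate has lower variance than either channel alone. This is a minimal illustration under an independent-Gaussian-noise assumption, not the paper's full coupled model; the numbers below are invented for illustration.

```python
def combine(est_touch, var_touch, est_vision, var_vision):
    """Fuse two noisy estimates; each weight is the estimate's
    inverse variance, normalized to sum to 1."""
    w_touch = (1.0 / var_touch) / (1.0 / var_touch + 1.0 / var_vision)
    w_vision = 1.0 - w_touch
    fused = w_touch * est_touch + w_vision * est_vision
    # Fused variance is the harmonic combination: always smaller
    # than either unimodal variance, matching the reduced variability
    # observed in the bimodal conditions.
    fused_var = 1.0 / (1.0 / var_touch + 1.0 / var_vision)
    return fused, fused_var

# Illustrative values: touch is more reliable (smaller variance),
# so the fused count sits closer to the tactile estimate.
est, var = combine(est_touch=4.0, var_touch=0.5, est_vision=5.0, var_vision=1.0)
```

With these numbers the tactile weight is 2/3, so the fused estimate is 13/3 ≈ 4.33 with variance 1/3, below both unimodal variances.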

Details

Language(s):
 Dates: 2006-04
 Publication Status: Published in print
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Method: -
 Identifiers: DOI: 10.1167/6.5.2
BibTeX Citekey: 3787
 Degree: -


Source 1

Title: Journal of Vision
Source Genre: Journal
Publ. Info: Charlottesville, VA : Scholar One, Inc.
Pages: -
Volume / Issue: 6 (5)
Sequence Number: -
Start / End Page: 554 - 564
Identifier: ISSN: 1534-7362
CoNE: https://pure.mpg.de/cone/journals/resource/111061245811050