  Crossmodal correspondences

Spence, C., Parise, C., & Deroy, O. (2011). Crossmodal correspondences. i-Perception, 2(8), 887.


Creators

Creators:
Spence, C., Author
Parise, C. V. (1, 2), Author
Deroy, O., Author
Affiliations:
(1) Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497806
(2) Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Abstract: In many everyday situations, our senses are bombarded by numerous different unisensory signals at any given time. In order to gain the most veridical, and least variable, estimate of environmental stimuli/properties, we need to combine the individual noisy unisensory perceptual estimates that refer to the same object, while keeping those estimates belonging to different objects or events separate. How, though, does the brain ‘know’ which stimuli to combine? Traditionally, researchers interested in the crossmodal binding problem have focused on the role that spatial and temporal factors play in modulating multisensory integration. However, crossmodal correspondences between various unisensory features (such as between auditory pitch and visual size) may provide yet another important means of constraining the crossmodal binding problem. A large body of research now shows that people exhibit consistent crossmodal correspondences between many stimulus features in different sensory modalities. So, for example, people will consistently match high-pitched sounds with small, bright objects that are located high up in space. In this talk, the latest literature is reviewed. We will argue that crossmodal correspondences need to be considered, alongside semantic and spatiotemporal congruency, among the key constraints that help our brains to solve the crossmodal binding problem. Crossmodal correspondences will also be distinguished from synaesthesia.

Details

Dates: 2011-10
Publication Status: Published in print
Identifiers:
BibTex Citekey: SpencePD2011
DOI: 10.1068/ic887

Event

Title: 12th International Multisensory Research Forum (IMRF 2011)
Place of Event: Fukuoka, Japan
Start-/End Date: 2011-10-17 - 2011-10-20


Source 1

Title: i-Perception
Source Genre: Journal
Volume / Issue: 2 (8)
Start / End Page: 887