Released

Poster

Temporal calibration between the visual, auditory and tactile senses: A psychophysical approach

MPG Authors

Machulla, T.
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Di Luca, M.
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ernst, M.
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Machulla, T., Di Luca, M., & Ernst, M. (2007). Temporal calibration between the visual, auditory and tactile senses: A psychophysical approach. Poster presented at 1st Peach Summer School, Santorini, Greece.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-CD1B-1
Abstract
Human observers acquire information about physical properties of the environment through different sensory modalities. For natural events, these sensory signals show a specific temporal, spatial and contextual configuration that aids their integration into a coherent multisensory percept. In multimodal virtual environments, however, signals have to be created and displayed separately for the different modalities, which may result in a miscalibration of these signals. This, in turn, can greatly reduce the observer's sense of immersion and presence.
Using psychophysical methods, we investigate fundamental questions regarding how the temporal alignment of signals from the visual, auditory and tactile modalities is achieved. A first project examines the perception of subjective simultaneity of signals. Simultaneity detection poses a non-trivial matching problem for the human brain: physical and neural transmission times differ greatly between the senses. As there is only partial compensation for these differential delays, subjective simultaneity may result from presenting stimuli with a physical delay. Here, we are interested in whether this phenomenon reflects an amodal timing mechanism that operates across all modalities in a uniform fashion. Further, we examine the sensitivity of asynchrony detection for different modality pairs as well as interindividual differences.
In a second project, we examine the ability of the human cognitive system to adapt to asynchronous information in different modalities. Adaptation may be used to reduce the disruptive effects of temporal miscalibration between signals in different modalities. We are interested in the strength of adaptation as well as the mechanism underlying this effect.
Future projects aim to investigate:
- the precise relationship between the perception of synchrony and multimodal integration,
- the influence of prior knowledge about a common origin of signals on the perception of synchrony,
- the influence of timing on the perception of cause and effect,
- the neural basis of the detection of synchrony.
In conclusion, our research seeks to understand the mechanisms underlying temporal calibration between different sensory modalities, with the goal of identifying factors that foster multimodal integration and, in turn, the sense of presence.