
Released

Meeting Abstract

Coding and predicting the resolution of temporal discrepancies across the senses

MPS-Authors

Hartcher O'Brien, J
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Hartcher O'Brien, J. (2011). Coding and predicting the resolution of temporal discrepancies across the senses. In TIMELY Workshop on the “Psychophysical, Computational, and Neuroscience Models of Time Perception” (pp. 57-58).


Cite as: https://hdl.handle.net/21.11116/0000-0002-54C6-0
Abstract
Time is an essential dimension of human experience, yet our understanding of how temporal signals are coded and combined across sensory modalities remains unclear. The aim of the three studies presented here is (1) to determine whether temporal cues are combined in the statistically optimal way described by the Maximum Likelihood Estimation (MLE) model, (2) to assess whether temporal features such as the onset or offset of a signal are systematically related to the impulse response functions for the relevant stimuli, and (3) to examine how the brain adapts to asynchrony between what we see and hear and recalibrates to maintain the temporal coincidence of an audiovisual event.

In Experiment 1, observers estimated the duration of events using a two-interval forced-choice (2IFC) procedure, indicating whether the first or the second interval was longer. We manipulated the reliability of the auditory signal to vary the relative weighting of the visual and auditory components, allowing us to test the integration strategy of a statistically optimal (MLE) observer against our empirical bimodal data. We found that, for signals of equal weight, the MLE model predicts the perceived duration and yields an optimal reduction in the noise of the bimodal duration estimates.

Breaking perceived duration down into the estimation of distinct time points, i.e. the onset, peak amplitude and offset of a Gaussian signal, the second study investigated the type of filter or transfer function that describes the difference between physical and perceived time. We explored whether such perceptual estimates of temporal points (onset, peak amplitude and offset) deviate strongly from physical time, and whether any deviation is consistent across sensory modalities. Participants undertook a temporal order judgement task and estimated the onset, peak amplitude and offset of a long Gaussian signal (sigma = 150 ms) by comparing it with a short, spike-like comparison stimulus (sigma = 5 ms). The stimulus configurations tested were unimodal visual with a long (V) and a short (v) Gaussian signal (V-v), unimodal auditory (A-a), and crossmodal (A-v, V-a), randomised within blocks. The results demonstrate that temporal (PSS) estimates depend on the modality of the long stimulus: onset PSS for the A-a and A-v configurations occurred earlier than for the V-v and V-a configurations, and offset estimates for the A-a and A-v conditions were perceived as later than for the V-v and V-a configurations. Peak amplitude estimates, however, produced an amodal pattern, with all estimates aligned but perceived earlier than the physical peak. The duration of long auditory signals was overestimated, whereas that of long visual signals was perceived as shortened. Discrimination thresholds were better for long visual signals than for long auditory signals, irrespective of which point in the temporal event was estimated. Such differences can potentially be used to explain illusions such as the flash-lag effect and discrepancies in perceived duration across the senses. We quantitatively explain these effects using models of signal processing.
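As a point of reference for the MLE prediction tested in Experiment 1, the sketch below illustrates reliability-weighted cue combination under the standard independent-Gaussian-noise assumptions; the duration and sigma values are illustrative placeholders, not measured data from the experiments.

```python
import numpy as np

# Minimal sketch of MLE cue combination for audiovisual duration estimates,
# assuming independent Gaussian noise on the auditory and visual cues.
def mle_combine(est_a, sigma_a, est_v, sigma_v):
    """Combine auditory and visual estimates by reliability weighting."""
    w_a = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
    w_v = 1.0 - w_a
    combined = w_a * est_a + w_v * est_v           # weighted bimodal estimate
    sigma_av = np.sqrt((sigma_a**2 * sigma_v**2) /
                       (sigma_a**2 + sigma_v**2))  # predicted bimodal noise
    return combined, sigma_av

# Example with equally reliable cues (illustrative numbers only).
duration, noise = mle_combine(est_a=500, sigma_a=60, est_v=520, sigma_v=60)
print(f"predicted bimodal duration: {duration:.1f} ms, sigma: {noise:.1f} ms")
```

For equally reliable cues the weights are equal and the predicted bimodal noise drops by a factor of sqrt(2), which corresponds to the optimal decrease in the noise of the duration estimates described above.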
The third experiment examined recalibration: we explored which sensory modality changes as a function of adaptation to audiovisual asynchrony. We measured response times (RTs) and found that RTs to sounds became progressively faster with increasing exposure to visual-leading asynchrony, and progressively slower in the reverse condition, providing the first empirical indication that speeded responses to sounds are influenced by exposure to audiovisual asynchrony. Our results suggest that time is coded in a signal-dependent manner, but that changes in the integration of temporal information are plastic and can occur online.
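For readers unfamiliar with how the PSS and discrimination-threshold values discussed above are typically extracted, the following sketch fits a cumulative Gaussian to hypothetical temporal order judgement data; the SOAs, response proportions and starting values are invented for illustration and are not taken from the reported experiments.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Fit a cumulative Gaussian psychometric function to the proportion of
# "probe later" responses as a function of stimulus onset asynchrony (SOA).
def psychometric(soa, pss, sigma):
    """Probability of judging the probe as later than the standard."""
    return norm.cdf(soa, loc=pss, scale=sigma)

soas = np.array([-200, -120, -60, 0, 60, 120, 200])        # ms, probe minus standard
p_later = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.98])  # hypothetical data

(pss, sigma), _ = curve_fit(psychometric, soas, p_later, p0=(0.0, 80.0))
print(f"PSS: {pss:.1f} ms, threshold (sigma): {sigma:.1f} ms")
```

The PSS is the SOA at which the two temporal orders are reported equally often, and the fitted sigma serves as a discrimination threshold; a shift of the fitted PSS after adaptation is one common way to quantify the kind of recalibration examined in the third experiment.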