
Released

Journal Article

Integration of visual and inertial cues in the perception of angular self-motion

MPS-Authors

Soyka, F
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Barnett-Cowan, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

de Winkel, K., Soyka, F., Barnett-Cowan, M., Bülthoff, H., Groen, E., & Werkhoven, P. (2013). Integration of visual and inertial cues in the perception of angular self-motion. Experimental Brain Research, 231(2), 209-218. doi:10.1007/s00221-013-3683-1.


Cite as: https://hdl.handle.net/11858/00-001M-0000-001A-12AD-A
Abstract
The brain is able to determine angular self-motion from visual, vestibular, and kinesthetic information. There is compelling evidence that both humans and non-human primates integrate visual and inertial (i.e., vestibular and kinesthetic) information in a statistically optimal fashion when discriminating heading direction. In the present study, we investigated whether the brain integrates information about angular self-motion in a similar manner. Eight participants performed a two-interval forced-choice (2IFC) task in which they discriminated yaw rotations (2-s sinusoidal acceleration) on peak velocity. Just-noticeable differences (JNDs) were determined as a measure of precision in unimodal inertial-only and visual-only trials, as well as in bimodal visual–inertial trials. The visual stimulus was a moving stripe pattern synchronized with the inertial motion. The peak velocity of comparison stimuli was varied relative to the standard stimulus. Individual analyses showed that the data of three participants exhibited an increase in bimodal precision, consistent with the optimal integration model, whereas the data of the other participants did not conform to maximum-likelihood integration schemes. We suggest that the sensory cues may not have been perceived as congruent, that integration might be achieved with fixed weights, or that estimates of visual precision obtained from non-moving observers do not accurately reflect visual precision during self-motion.
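The maximum-likelihood integration model against which the abstract evaluates the data makes a concrete quantitative prediction: if JNDs are proportional to the standard deviations of independent Gaussian sensory estimates, the optimally combined estimate has a variance equal to the product of the unimodal variances divided by their sum, so the bimodal JND should fall below either unimodal JND. A minimal sketch of that prediction (an illustration of the standard model, not the paper's actual analysis code; function names are our own):

```python
import math

def mle_bimodal_jnd(jnd_visual, jnd_inertial):
    """Predicted bimodal JND under maximum-likelihood cue integration.

    Treats each JND as proportional to the standard deviation of an
    independent Gaussian estimate; the optimal combined variance is
    (sigma_v^2 * sigma_i^2) / (sigma_v^2 + sigma_i^2).
    """
    v2, i2 = jnd_visual ** 2, jnd_inertial ** 2
    return math.sqrt(v2 * i2 / (v2 + i2))

def mle_weights(jnd_visual, jnd_inertial):
    """Optimal cue weights (inverse-variance weighting); they sum to 1."""
    v2, i2 = jnd_visual ** 2, jnd_inertial ** 2
    w_visual = i2 / (v2 + i2)
    return w_visual, 1.0 - w_visual
```

For illustrative JNDs of 2.0 (visual) and 3.0 (inertial), the predicted bimodal JND is sqrt(36/13) ≈ 1.66, smaller than either unimodal JND, and the visual cue receives the larger weight (9/13) because it is the more precise one. A participant whose bimodal JND does not drop below the best unimodal JND, as for five of the eight participants here, is inconsistent with this prediction.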