Abstract:
Understanding vision has always been at the centre of research in perception and cognition. Experiments on vision, however, have usually been conducted with a strong focus on perception, neglecting the fact that in most natural tasks sensory signals are ultimately used not for perception but for action. The effects of an action are in turn sensed by the sensory system, so that perception and action form complementary parts of a dynamic control system. In addition, the human sensory system receives input from multiple senses, which must be integrated in order to solve tasks ranging from standing upright to controlling complex vehicles. In our Cybernetics research group we use psychophysical, physiological, modelling, and simulation techniques to study how cues from different sensory modalities are integrated by the brain to perceive, act in, and interact with the real world. In psychophysical studies, we have shown that humans often, but not always, integrate multimodal sensory information in a statistically optimal way, weighting cues according to their reliability. In this talk, I will present results from our studies on multisensory integration for perception and action in both natural and simulated environments, across different tasks, using our latest simulator technologies: the Cyberwalk omnidirectional treadmill and the MPI Motion Simulator, which is based on a large industrial robot arm.
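To make "statistically optimal" concrete: in the standard maximum-likelihood model of cue combination, each cue's weight is proportional to its reliability, i.e. the inverse of its noise variance. The sketch below illustrates that computation; it is not the group's analysis code, and all cue names, estimates, and noise levels are invented for illustration.

    import numpy as np

    # Two noisy estimates of the same quantity (e.g., object size seen
    # and felt), with assumed noise standard deviations (illustrative).
    visual_estimate, visual_sigma = 10.2, 0.5   # more reliable cue
    haptic_estimate, haptic_sigma = 11.0, 2.0   # less reliable cue

    # Reliability = inverse variance; the maximum-likelihood weight of
    # each cue is its reliability divided by the total reliability.
    reliabilities = np.array([1 / visual_sigma**2, 1 / haptic_sigma**2])
    weights = reliabilities / reliabilities.sum()

    estimates = np.array([visual_estimate, haptic_estimate])
    combined = weights @ estimates                      # weighted average
    combined_sigma = np.sqrt(1 / reliabilities.sum())  # fused noise level

    print(f"weights: {weights}")                  # vision dominates: ~[0.94, 0.06]
    print(f"combined: {combined:.2f} +/- {combined_sigma:.2f}")

A signature prediction of this model is visible in the last line: the fused estimate has lower variance than either cue alone (here about 0.49 versus 0.5 for vision), and it is this quantitative prediction that such psychophysical studies test against human performance.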