

  Walking as the ultimate challenge for the multisensory brain

Frissen, I. (2008). Walking as the ultimate challenge for the multisensory brain. Talk presented at York University: Centre for Vision Research. Toronto, ON, Canada. 2008-09-05.

Frissen, I.1,2, Author
1 Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497806
2 Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794


 Abstract: Walking is arguably the most common of human actions, and it seems relatively effortless as well: you just get up and go. Its apparent ease, however, belies the complexity of the underlying biomechanical, sensory, and even cognitive mechanisms. When we consider the sensory processes involved, it seems that virtually every available sensory system contributes at one point or another. But how does the brain deal with this plethora of sensory information? Walking, then, poses the ultimate multisensory problem for the brain. This talk is in two parts. In the first part, I introduce a new and unique omnidirectional treadmill setup that allows the user to walk through large-scale virtual environments in an unconstrained and natural manner. It is in effect the closest thing to a full world simulator presently available and a groundbreaking step towards understanding the multisensory brain in action. The platform is the result of an international collaboration between leading European research institutes in the fields of mechatronics, (optimal) control, visualization, markerless tracking, and human perception, in a project called CyberWalk (www.cyberwalk-project.org). In the second part I discuss a series of experiments specifically examining the integration of inertial and kinaesthetic information. Work to date has been hampered by the fact that in the real world these two sources of information are confounded with each other. To tackle this problem we employed a circular treadmill with an active handlebar. With such a setup one can finally decouple the two sources of sensory input and study their relative contributions. We employed a variety of paradigms. In one we used a novel spatial updating paradigm; in this particular case, we also addressed specific questions that remain open in that field. I will also report results from magnitude estimation and two-interval forced-choice (2IFC) experiments.
The general finding is that even for an action as complex as walking, the brain deals with incoming information in a way reminiscent of more 'conventional' forms of multisensory interaction. Thus, our results are consistent with a weighted-average integration mechanism in which the weights are determined by the relative reliability of the sensory information. But these are still only first steps. With the help of the aforementioned setups we are now able to go beyond the investigation of just two streams of information and examine the interaction between three or even four of them. Thus we will finally, and truly, start looking at the multisensory brain in action.
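The weighted-average account described in the abstract corresponds to the standard inverse-variance (maximum-likelihood) cue-combination rule, which can be sketched as a short calculation. The function and the numbers below are purely illustrative assumptions, not data from the talk:

```python
def integrate(estimates, variances):
    """Combine cue estimates by inverse-variance weighting.

    Each cue's weight is its reliability (1/variance) divided by the
    summed reliability across cues; the combined estimate is the
    weighted average, and the combined variance is the inverse of the
    summed reliability (never larger than any single cue's variance).
    """
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    combined = sum(w * x for w, x in zip(weights, estimates))
    combined_var = 1.0 / total
    return combined, combined_var

# Hypothetical example: an inertial cue signals 10 deg of rotation
# (variance 4) and a kinaesthetic cue signals 14 deg (variance 1).
est, var = integrate([10.0, 14.0], [4.0, 1.0])
# The more reliable kinaesthetic cue dominates:
# est = 0.2 * 10 + 0.8 * 14 = 13.2, var = 0.8
```

Under this scheme, changing the reliability of one cue (e.g. by degrading the visual or inertial input experimentally) shifts the weights in a predictable way, which is what makes the weighted-average account testable.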


 Dates: 2008-09
 Publication Status: Published online


Title: York University: Centre for Vision Research
Place of Event: Toronto, ON, Canada
Start-/End Date: 2008-09-05
Invited: Yes
