
Talk

Walking as the ultimate challenge for the multisensory brain

MPS-Authors

Frissen,  I
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Frissen, I. (2008). Walking as the ultimate challenge for the multisensory brain. Talk presented at York University: Centre for Vision Research. Toronto, ON, Canada. 2008-09-05.


Cite as: https://hdl.handle.net/21.11116/0000-0003-A560-6
Abstract


Walking is arguably the most common of human actions, and it seems relatively effortless as well: you just get up and go. Its apparent ease, however, belies the complexity of the underlying biomechanical, sensory, and even cognitive mechanisms. When we consider the sensory processes involved, it seems that virtually all available sensory systems contribute at one point or another. But how does the brain deal with this plethora of sensory information? Walking thus presents the brain with the ultimate multisensory problem.

This talk will be in two parts. In the first part, I introduce a new and unique omnidirectional treadmill setup that allows the user to walk through large-scale virtual environments in an unconstrained and natural manner. It is, in effect, the closest thing to a full world simulator presently available and a groundbreaking step towards understanding the multisensory brain in action. The platform is the result of an international collaboration between leading European research institutes in the fields of mechatronics, (optimal) control, visualization, markerless tracking, and human perception, in a project called CyberWalk (www.cyberwalk-project.org).

In the second part, I discuss a series of experiments looking specifically at the integration of inertial and kinaesthetic information. Work to date has been hampered by the fact that in the real world these two sources of information are confounded. To tackle this problem we employed a circular treadmill with an active handlebar, a setup that finally allows the two sources of sensory input to be decoupled so that their relative contributions can be studied. We used a variety of paradigms, among them a novel spatial updating task, which also allowed us to address questions that remain open in that field. I will also report results from magnitude estimation and two-interval forced-choice (2IFC) experiments.

The general finding is that even for an action as complex as walking, the brain deals with the incoming information in a way reminiscent of more 'conventional' forms of multisensory interaction. That is, our results are consistent with a weighted-average integration mechanism, with the weights determined by the relative reliability of the sensory information.
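
To make the weighted-average idea concrete, here is the standard maximum-likelihood formulation of reliability-weighted cue combination, written in LaTeX; the subscripts I (inertial) and K (kinaesthetic) and the variance symbols are illustrative notation, not a formalism specified in the talk:

    % Reliability-weighted (maximum-likelihood) cue combination:
    % each cue's weight is its relative reliability, i.e. its inverse variance.
    \hat{s} = w_I s_I + w_K s_K,
    \qquad
    w_I = \frac{1/\sigma_I^2}{1/\sigma_I^2 + 1/\sigma_K^2},
    \qquad
    w_K = \frac{1/\sigma_K^2}{1/\sigma_I^2 + 1/\sigma_K^2}

    % The combined estimate is then at least as reliable as the better single cue:
    \sigma_{\hat{s}}^2
      = \left( \frac{1}{\sigma_I^2} + \frac{1}{\sigma_K^2} \right)^{-1}
      \le \min\!\left( \sigma_I^2, \sigma_K^2 \right)

This variance-reduction property is the hallmark prediction of such models, and it is precisely the kind of quantitative prediction that magnitude estimation and 2IFC experiments are suited to test.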

But these are still only first steps. With the help of the setups described above, we can now go beyond the investigation of just two streams of information and examine the interaction between three or even four of them. Only then will we truly begin to look at the multisensory brain in action.