Poster

Predicting Change of Vestibular Direction Detection Thresholds from Acceleration Profile Differences

MPS-Authors

Soyka, F
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Beykirch, K
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Robuffo Giordano, P
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Soyka, F., Beykirch, K., Robuffo Giordano, P., & Bülthoff, H. (2010). Predicting Change of Vestibular Direction Detection Thresholds from Acceleration Profile Differences. Poster presented at XXVI Bárány Society Meeting, Reykjavik, Iceland.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-BEE8-C
Abstract
In the absence of vision, the perceived direction of translational self-motion is largely governed by signals originating from the otoliths. Although it has been shown that direction detection thresholds depend on the frequency of the motion stimulus, the influence of the actual time course of the motion has not been thoroughly investigated. The goal of our study was to measure, model and predict vestibular direction detection thresholds for different motion profiles in the horizontal plane.
Detection thresholds for three acceleration profiles, one sinusoidal and two non-sinusoidal (Fig. 1A), each at three different durations, were measured for six human participants. An anthropomorphic robot arm, the Max Planck CyberMotion Simulator, was used to provide the motion stimuli. The experiment was designed as a four-alternative forced-choice task in which blindfolded participants judged the direction of motion from four possibilities: forward, backward, left or right. Stimulus intensity (the peak acceleration of the motion profile) was varied with a Bayesian adaptive method, and a psychometric function fit to the measurements determined the sensory threshold.
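
To make the fitting step concrete, here is a minimal sketch (not the authors' code) of a maximum-likelihood fit of a cumulative-Gaussian psychometric function with the 25% guess rate of a four-alternative task; the trial data and starting values are hypothetical.

```python
# Minimal sketch (not the authors' code): maximum-likelihood fit of a
# psychometric function for a 4AFC direction task (chance level 25%).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def p_correct(intensity, threshold, spread, guess=0.25):
    """Cumulative-Gaussian psychometric function with a 4AFC guess rate."""
    return guess + (1.0 - guess) * norm.cdf(intensity, loc=threshold, scale=spread)

def neg_log_likelihood(params, intensity, correct):
    threshold, spread = params
    p = np.clip(p_correct(intensity, threshold, abs(spread)), 1e-6, 1 - 1e-6)
    return -np.sum(correct * np.log(p) + (1 - correct) * np.log(1 - p))

# Hypothetical trials: peak accelerations (m/s^2) and correct (1) / wrong (0).
intensity = np.array([0.02, 0.05, 0.08, 0.10, 0.15, 0.20, 0.05, 0.10])
correct   = np.array([0,    0,    1,    1,    1,    1,    1,    0])

fit = minimize(neg_log_likelihood, x0=[0.10, 0.05],
               args=(intensity, correct), method="Nelder-Mead")
print("estimated threshold (m/s^2):", fit.x[0])
```
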
For modeling, a second-order linear dynamical system with two poles and one zero, originally proposed by Young and Meiry (1968), was used to describe the data. The parameters of this model had previously been identified with sinusoidal motion stimuli over a broad frequency range for similar tasks, but whether the model can predict perceptual thresholds for general motion profiles was unknown. In our study, the thresholds obtained from the three sinusoidal acceleration profiles were used to identify the static gain of the model by fitting the system gain to the inverted thresholds; the other parameters were taken from the literature.
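
Written out, the model is a transfer function H(s) = K(tau_z s + 1) / ((tau_1 s + 1)(tau_2 s + 1)). The sketch below shows the gain-identification step under assumed time constants and made-up sinusoidal thresholds; none of these numbers are the values used in the study.

```python
# Sketch of the two-pole/one-zero otolith model and the static-gain fit.
# Time constants and thresholds below are placeholders, not the study's values.
import numpy as np
from scipy import signal

tau_z, tau_1, tau_2 = 1.0, 5.0, 0.02           # assumed time constants (s)

def make_model(K):
    # H(s) = K*(tau_z*s + 1) / ((tau_1*s + 1)*(tau_2*s + 1))
    return signal.TransferFunction([K * tau_z, K],
                                   [tau_1 * tau_2, tau_1 + tau_2, 1.0])

# Hypothetical sinusoidal conditions: frequencies and measured thresholds.
freqs_hz   = np.array([0.5, 1.0, 2.0])
thresholds = np.array([0.12, 0.08, 0.06])       # m/s^2

# Fit K so that the model gain matches the inverted thresholds (least squares).
w = 2 * np.pi * freqs_hz
_, mag_db, _ = signal.bode(make_model(1.0), w)  # gain for K = 1, in dB
g = 10 ** (mag_db / 20)
K = np.sum(g / thresholds) / np.sum(g ** 2)     # closed-form 1-D least squares
print("fitted static gain K:", K)
```
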
Predicting the thresholds for general motion profiles was based on the assumption that the output of the model can be interpreted as the signal intensity coming from the otoliths, and that the correct direction of motion can be perceived once this intensity exceeds a certain value. To predict a threshold, the peak acceleration of the input profile must therefore be chosen such that the corresponding maximum model output equals one (Fig. 1B).
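
A sketch of this rule, reusing the same model form with assumed parameters and an example raised-cosine acceleration profile (not one of the study's stimuli): because the model is linear, the response to a unit-peak profile only needs to be rescaled.

```python
# Sketch of the prediction rule: choose the peak acceleration so that the
# maximum model output equals 1. All parameter values are assumptions.
import numpy as np
from scipy import signal

K, tau_z, tau_1, tau_2 = 30.0, 1.0, 5.0, 0.02
sys = signal.TransferFunction([K * tau_z, K],
                              [tau_1 * tau_2, tau_1 + tau_2, 1.0])

t = np.linspace(0.0, 2.0, 2000)                      # 2 s stimulus
profile = 0.5 * (1 - np.cos(2 * np.pi * t / t[-1]))  # unit-peak example profile

_, y, _ = signal.lsim(sys, U=profile, T=t)           # response to unit-peak input
predicted = 1.0 / np.max(np.abs(y))                  # linearity: rescale to output 1
print("predicted threshold (m/s^2):", predicted)
```
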
Predictions for the remaining six non-sinusoidal profiles were in good agreement with the measured data, with the average error being smaller than 20% of the average detection threshold. This is a promising result, as only the static gain of the model was identified, from just three data points.
Accepting the linear model as a method to predict thresholds, it is also possible to fit the model to the non-sinusoidal profile data and identify the whole parameter set. Instead of fitting the system gain to the inverted sinusoidal thresholds, we computed the predictions for all profiles given a set of model parameters and iteratively varied the parameters to minimize the error between measurements and predictions. Two of the three identified model parameters agreed with the values given in the literature, while the third was found to differ. This difference suggests a phase lead at lower frequencies, which corresponds to a sensitivity to jerk (the time derivative of acceleration).
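
The sketch below illustrates such a joint fit with hypothetical stand-ins for the measured profiles and thresholds; the optimizer and misfit measure are illustrative choices, not necessarily the ones used in the study.

```python
# Sketch: identify all model parameters by minimizing the mismatch between
# measured and predicted thresholds. Profiles and thresholds are hypothetical.
import numpy as np
from scipy import signal
from scipy.optimize import minimize

def predict_threshold(params, t, profile):
    K, tau_z, tau_1, tau_2 = np.abs(params)          # keep parameters positive
    sys = signal.TransferFunction([K * tau_z, K],
                                  [tau_1 * tau_2, tau_1 + tau_2, 1.0])
    _, y, _ = signal.lsim(sys, U=profile, T=t)
    return 1.0 / np.max(np.abs(y))

def misfit(params, t, profiles, measured):
    preds = [predict_threshold(params, t, p) for p in profiles]
    return np.sum((np.asarray(preds) - measured) ** 2)

t = np.linspace(0.0, 2.0, 2000)
profiles = [0.5 * (1 - np.cos(2 * np.pi * t / t[-1]))]   # stand-in profile list
measured = np.array([0.09])                              # stand-in thresholds

fit = minimize(misfit, x0=[30.0, 1.0, 5.0, 0.02],
               args=(t, profiles, measured), method="Nelder-Mead")
print("identified (K, tau_z, tau_1, tau_2):", np.abs(fit.x))
```
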
Comparing threshold predictions between models with different jerk sensitivities reveals distinct differences at low frequencies. The predictions of a model with higher jerk sensitivity appear more appropriate and could be tested in future experiments.
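
For sinusoidal stimuli the predicted threshold scales as the inverse model gain, 1/|H(jw)|, so the effect of the jerk weighting can be illustrated by varying the zero time constant tau_z (placeholder values again):

```python
# Sketch: how the zero time constant (jerk weighting) shifts low-frequency
# threshold predictions. All parameter values are illustrative placeholders.
import numpy as np
from scipy import signal

K, tau_1, tau_2 = 30.0, 5.0, 0.02
freqs_hz = np.array([0.05, 0.1, 0.5, 1.0])
w = 2 * np.pi * freqs_hz
for tau_z in (0.5, 2.0):                         # lower vs. higher jerk sensitivity
    sys = signal.TransferFunction([K * tau_z, K],
                                  [tau_1 * tau_2, tau_1 + tau_2, 1.0])
    _, mag_db, _ = signal.bode(sys, w)
    print(f"tau_z={tau_z}:", np.round(1.0 / 10 ** (mag_db / 20), 4))
```
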
To summarize, we have shown that a linear model approach is able to predict vestibular direction detection thresholds. This allows the model parameters to be identified using non-sinusoidal stimuli and helps to better understand vestibular linear motion perception. Future studies will extend these measurements to lower frequencies and assess this modeling approach for rotational movements.