Keywords:
-
Abstract:
The automatic segmentation and classification of an unknown motion data stream according to given motion categories constitutes an important research problem with applications in computer animation, medicine, and sports science. In this thesis, we consider the scenario of trampoline motions, where an athlete performs a sequence of predefined trampoline jumps. Each jump follows certain rules and belongs to a specific motion category such as a pike jump or a somersault. The classification problem then consists in automatically segmenting an unknown trampoline motion sequence into its individual jumps and classifying these jumps according to the given motion categories. Since trampoline motions are very fast, cover a large space, and require special lighting conditions, capturing them with video or optical motion capture systems is problematic. Inertial sensors that measure accelerations and orientations are more suitable for capturing trampoline motions and have therefore been used for this thesis. However, inertial sensor output is noisy and abstract, requiring suitable feature representations that capture the characteristics of each motion category without being sensitive to noise and performance variations. A sensor data stream can then be transformed into a feature sequence for classification. For every motion category, a class representation (in our case, a class motion template) is learned from a class of example motions performed by different actors. The main idea, as employed in this thesis, is to locally compare the feature sequence of the unknown trampoline motion with all given class motion templates using a variant of dynamic time warping (DTW). The unknown motion stream is then automatically segmented and locally classified by the class template that best explains the corresponding segment. Extensive experiments on trampoline jumps performed by various athletes have been conducted to evaluate the feature representations as well as the segmentation and classification procedures.
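The local comparison described above can be illustrated with a minimal subsequence variant of DTW: a short class template is matched against a long feature stream, with the match allowed to start and end at any stream position. This is only a sketch under simplifying assumptions (one-dimensional features, absolute-difference local cost); the thesis uses its own DTW variant on motion-template features, and the function name `subsequence_dtw` is illustrative.

```python
import numpy as np

def subsequence_dtw(template, stream):
    """Match a short template against a long stream with DTW,
    allowing the match to begin and end anywhere in the stream.
    Returns (minimal accumulated cost, end index of best match).
    Sketch only: 1-D features, absolute-difference local cost."""
    N, M = len(template), len(stream)
    # local cost matrix: C[n, m] = distance between template[n] and stream[m]
    C = np.abs(np.subtract.outer(template, stream))
    D = np.full((N, M), np.inf)
    D[0, :] = C[0, :]  # "free" start: the match may begin at any stream index
    for n in range(1, N):
        D[n, 0] = D[n - 1, 0] + C[n, 0]
        for m in range(1, M):
            D[n, m] = C[n, m] + min(D[n - 1, m],      # stream frame repeated
                                    D[n, m - 1],      # template frame repeated
                                    D[n - 1, m - 1])  # both advance
    end = int(np.argmin(D[-1, :]))  # "free" end: best end position in the stream
    return D[-1, end], end
```

Sliding such a matching over the stream for every class template, and keeping, per segment, the template with the smallest cost, yields a joint segmentation and classification of the motion stream.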