
Released

Conference Paper

Efficient Real-Time Video Stabilization for UAVs Using Only IMU Data

MPS-Authors

Odelga, M
Project group: Autonomous Robotics & Human-Machine Systems, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Kochanek, N
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bülthoff, HH
Project group: Cybernetics Approach to Perception & Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Odelga, M., Kochanek, N., & Bülthoff, H. H. (2017). Efficient Real-Time Video Stabilization for UAVs Using Only IMU Data. In 4th Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS 2017) (pp. 210-215). Piscataway, NJ, USA: IEEE.


Cite as: http://hdl.handle.net/21.11116/0000-0000-C377-E
Abstract
While some unmanned aerial vehicles (UAVs) have the capacity to carry mechanically stabilized camera equipment, weight limits or other constraints may make mechanical stabilization impractical. As a result, many UAVs rely on fixed cameras to provide a video stream to an operator or observer. With a fixed camera, the video stream is often unsteady due to the multirotor's movement caused by wind and acceleration. These video streams are often analyzed by both humans and machines, and the unwanted camera movement can cause problems for both: for a human observer, it may simply make the video harder to follow, while for computer algorithms, it may severely impair the algorithm's intended function. There has been significant research on stabilizing video by using feature tracking to estimate camera movement, which in turn is used to manipulate frames and steady the stream. We believe, however, that this process can be greatly simplified by using data from a UAV's on-board inertial measurement unit (IMU) to stabilize the camera feed. In this paper, we present an algorithm for video stabilization based only on IMU data from a UAV platform. Our results show that the algorithm successfully stabilizes the camera stream with the added benefit of requiring less computational power.
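The paper itself gives no implementation details here, but the core idea it describes — compensating camera rotation using only gyroscope data — can be sketched as follows. This is a hypothetical illustration, not the authors' code: the gyro rates are integrated over one frame interval into a rotation matrix (Rodrigues' formula), and, for a calibrated camera with intrinsic matrix K, a pure rotation R induces the image-warping homography H = K Rᵀ K⁻¹, which maps pixels of the rotated frame back toward the reference orientation. All function names and the example calibration values are assumptions for the sketch.

```python
import numpy as np

def rotation_from_gyro(omega, dt):
    """Integrate a constant angular rate omega (rad/s, body frame) over dt
    seconds into a rotation matrix via Rodrigues' formula."""
    theta = np.asarray(omega, dtype=float) * dt  # rotation vector
    angle = np.linalg.norm(theta)
    if angle < 1e-12:
        return np.eye(3)
    ax = theta / angle
    # Skew-symmetric cross-product matrix of the rotation axis
    S = np.array([[0.0, -ax[2], ax[1]],
                  [ax[2], 0.0, -ax[0]],
                  [-ax[1], ax[0], 0.0]])
    return np.eye(3) + np.sin(angle) * S + (1.0 - np.cos(angle)) * (S @ S)

def stabilizing_homography(K_cam, R_delta):
    """Homography that warps pixels of the current frame back to the
    reference orientation, assuming pure rotation: H = K R^T K^{-1}."""
    return K_cam @ R_delta.T @ np.linalg.inv(K_cam)

# Hypothetical pinhole intrinsics: f = 500 px, principal point (320, 240)
K_cam = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0, 0.0, 1.0]])

omega = np.array([0.0, 0.0, 0.1])   # 0.1 rad/s roll about the optical axis
R = rotation_from_gyro(omega, 1.0 / 30.0)  # rotation over one 30 fps frame
H = stabilizing_homography(K_cam, R)
```

In a full pipeline, `H` would then be applied to each incoming frame (e.g. with a perspective warp) — which is what makes the IMU-only approach cheap: no feature detection or matching is needed, only a 3×3 matrix per frame.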