
Released

Journal Article

Virtual reality for animal navigation with camera-based optical flow tracking

MPS-Authors

Vishniakou, Ivan
Max Planck Research Group Neural Circuits, Center of Advanced European Studies and Research (caesar), Max Planck Society;


Seelig, Johannes D.
Max Planck Research Group Neural Circuits, Center of Advanced European Studies and Research (caesar), Max Planck Society;

Fulltext (public)

1-s2.0-S0165027019302602-main.pdf
(Publisher version), 11MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Vishniakou, I., Plöger, P. G., & Seelig, J. D. (2019). Virtual reality for animal navigation with camera-based optical flow tracking. Journal of Neuroscience Methods, 327: 108403. doi:10.1016/j.jneumeth.2019.108403.


Cite as: https://hdl.handle.net/21.11116/0000-0004-C704-7
Abstract
BACKGROUND: Virtual reality combined with a spherical treadmill is used across species for studying neural circuits underlying navigation and learning.
NEW METHOD: We developed an optical flow-based method for tracking treadmill ball motion in real time using a single high-resolution camera.
RESULTS: Tracking accuracy and timing were determined using calibration data. Ball tracking was performed at 500 Hz and integrated with an open source game engine for virtual reality projection. The projection was updated at 120 Hz with a latency with respect to ball motion of 30 ± 8 ms. The system was tested for behavior with fruit flies. The application and source code are available at https://github.com/ivan-vishniakou/neural-circuits-vr.
COMPARISON WITH EXISTING METHOD(S): Optical flow-based tracking of treadmill motion is typically achieved using optical mice. The camera-based optical flow tracking system developed here is based on off-the-shelf components and offers control over the image acquisition and processing parameters. This results in flexibility with respect to tracking conditions, such as ball surface texture, lighting conditions, or ball size, as well as camera alignment and calibration.
CONCLUSIONS: A fast system for rotational ball motion tracking suitable for virtual reality behavior with fruit flies was developed and characterized.
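To illustrate the principle behind camera-based optical flow tracking of a treadmill ball, a minimal sketch follows. This is not the authors' implementation (which is available at the GitHub repository cited above); all function names, the block-matching strategy, and the calibration parameters (pixels per millimeter, ball radius) are illustrative assumptions. The idea: estimate the pixel displacement of the ball's surface texture between consecutive camera frames, then convert that displacement to ball rotation using the imaging geometry.

```python
# Hypothetical sketch of optical-flow ball tracking: estimate the texture
# shift between two frames by block matching (sum of squared differences),
# then convert the pixel shift to rotation. Not the paper's actual code.
import math

def make_frame(width, height, phase):
    """Synthetic 'ball surface' texture shifted horizontally by `phase` px."""
    return [[math.sin(0.3 * (x - phase)) + 0.5 * math.sin(0.7 * (x - phase) + y)
             for x in range(width)] for y in range(height)]

def ssd(prev, curr, shift, margin):
    """Sum of squared differences if curr is prev displaced by `shift` px."""
    h, w = len(prev), len(prev[0])
    total = 0.0
    for y in range(h):
        for x in range(margin, w - margin):
            d = curr[y][x] - prev[y][x - shift]
            total += d * d
    return total

def estimate_shift(prev, curr, max_shift=5):
    """Optical-flow estimate (horizontal only): the shift minimising SSD."""
    return min(range(-max_shift, max_shift + 1),
               key=lambda s: ssd(prev, curr, s, max_shift))

def shift_to_rotation(shift_px, px_per_mm, ball_radius_mm):
    """Convert a pixel displacement on the ball surface to rotation (radians)."""
    arc_mm = shift_px / px_per_mm
    return arc_mm / ball_radius_mm

# Two consecutive "camera frames": the surface texture moved 3 px between them.
prev = make_frame(64, 16, phase=0)
curr = make_frame(64, 16, phase=3)
shift = estimate_shift(prev, curr)
print(shift)  # → 3
print(shift_to_rotation(shift, px_per_mm=10.0, ball_radius_mm=4.5))  # ≈ 0.0667 rad
```

A real-time system as described in the article would run such an estimate per frame at high rate (here 500 Hz) and for all three rotational axes, typically with subpixel refinement; this sketch only shows the single-axis, integer-shift core of the idea.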