
Released

Poster

Markerless tracking of user-defined anatomical features with deep learning

MPS-Authors

Bethge, M
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Computational Vision and Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Mathis, M., Mathis, A., Mamidanna, P., Abe, T., Murthy, V., & Bethge, M. (2018). Markerless tracking of user-defined anatomical features with deep learning. Poster presented at 28th Annual Meeting of the Society for the Neural Control of Movement (NCM 2018), Santa Fe, NM, USA.


Cite as: http://hdl.handle.net/21.11116/0000-0001-7DF0-4
Abstract
Quantifying behavior is crucial for many applications in neuroscience. Videography provides an easy way to observe animals, yet extracting particular aspects of a behavior can be highly time-consuming. In motor control studies, humans or other animals are often fitted with reflective markers to assist computer-based tracking, yet markers are intrusive, especially for smaller animals, and the number and location of the markers must be determined a priori. Here we provide a highly efficient method for markerless tracking in mice based on transfer learning with very few training samples (~200 frames). We demonstrate the versatility of this framework by tracking various body parts of mice in different tasks: odor trail-tracking (by one or multiple mice simultaneously) and a skilled forelimb reach-and-pull task. For example, during the skilled reaching behavior, individual digit joints of the hand can be tracked automatically. Remarkably, even when only a small number of frames are labeled, the algorithm achieves tracking performance on test frames comparable to human accuracy.
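
The abstract describes transfer learning: a network pretrained on a large image dataset is fine-tuned on a few hundred labeled frames to predict user-defined body parts. Below is a minimal sketch of that general recipe in PyTorch; it is not the authors' implementation, and the class name, layer sizes, heatmap resolution, and hyperparameters are all illustrative assumptions. It pairs a pretrained ResNet backbone with a small deconvolutional head that regresses one heatmap per tracked body part.

# Sketch only (not the poster's code): transfer learning for markerless
# keypoint tracking. A ResNet pretrained on ImageNet supplies features;
# a small deconvolutional head is fine-tuned on a few hundred labeled
# frames to emit one heatmap per user-defined body part.
import torch
import torch.nn as nn
from torchvision import models

class KeypointNet(nn.Module):  # hypothetical name
    def __init__(self, num_parts: int):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        # Drop the final pooling and classification layers, keep the
        # convolutional feature extractor (output: N x 2048 x 7 x 7 for
        # 224 x 224 input).
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        # Deconvolutional head: upsample coarse features (7x7 -> 28x28)
        # into per-part heatmaps.
        self.head = nn.Sequential(
            nn.ConvTranspose2d(2048, 256, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(256, num_parts, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.head(self.features(x))

model = KeypointNet(num_parts=5)  # e.g., digit joints of the reaching hand

# Fine-tune on a small labeled set (~200 frames in the abstract).
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()  # regress target heatmaps centered on the labels

frames = torch.randn(4, 3, 224, 224)   # placeholder batch of video frames
targets = torch.randn(4, 5, 28, 28)    # placeholder target heatmaps

model.train()
optimizer.zero_grad()
loss = criterion(model(frames), targets)
loss.backward()
optimizer.step()

Because the backbone already encodes generic visual features, only a modest amount of labeled data is needed to adapt it to new body parts, which is the key to the few-sample efficiency the abstract reports.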