
Released

Poster

Markerless tracking of user-defined anatomical features with deep learning

MPG Authors

Bethge,  M
Research Group Computational Vision and Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe.
Supplementary material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Mathis, A., Mamidanna, P., Abe, T., Cury, K., Murthy, V., Mathis, M., et al. (2018). Markerless tracking of user-defined anatomical features with deep learning. Poster presented at CSF Conference: Hand, Brain and Technology: The Somatosensory System (HBT 2018), Monte Verità, Switzerland.


Citation link: https://hdl.handle.net/21.11116/0000-0002-B7C7-F
Abstract
Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for observing and recording animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time-consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist computer-based tracking, yet markers are intrusive (especially for smaller animals), and their number and locations must be determined a priori. We present a highly efficient method for markerless tracking based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in a broad collection of experimental settings: odor trail-tracking in mice, egg-laying behavior in Drosophila, and mouse hand articulation in a skilled forelimb task. For example, during the skilled reaching behavior, individual joints can be automatically tracked (and a confidence score is reported). Remarkably, even when only a small number of frames are labeled, the algorithm achieves excellent tracking performance on test frames, comparable to human accuracy.
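The per-joint location and confidence score mentioned above are typically read off a part-specific score map predicted by the network. The abstract does not specify the output format, so the following is only a minimal NumPy sketch under the assumption of a heatmap-style output, with a hypothetical `extract_keypoint` helper:

```python
import numpy as np

def extract_keypoint(scoremap):
    """Return the (row, col) peak location and its confidence for one
    body part from a predicted score map. Hypothetical sketch; not the
    poster's actual implementation."""
    idx = np.unravel_index(np.argmax(scoremap), scoremap.shape)
    location = tuple(int(i) for i in idx)
    confidence = float(scoremap[idx])
    return location, confidence

# Toy score map standing in for a network prediction of one joint.
heatmap = np.zeros((8, 8))
heatmap[3, 5] = 0.9
loc, conf = extract_keypoint(heatmap)
```

In practice, a low peak value can be used to flag frames where the body part is occluded or the prediction is unreliable.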