
Released

Journal Article

An open-source high-speed infrared videography database to study the principles of active sensing in freely navigating rodents

MPS-Authors

Miceli, Stephanie
External Organizations;
Max Planck Research Group Neural Circuits, Center of Advanced European Studies and Research (caesar), Max Planck Society;

Citation

Azarfar, A., Zhang, Y., Alishbayli, A., Miceli, S., Kepser, L., van der Wielen, D., et al. (2018). An open-source high-speed infrared videography database to study the principles of active sensing in freely navigating rodents. GigaScience, 7(12): giy134. doi:10.1093/gigascience/giy134.


Cite as: https://hdl.handle.net/21.11116/0000-0004-40DD-B
Abstract
Background

Active sensing is crucial for navigation. It is characterized by self-generated motor action that controls the accessibility and processing of sensory information. In rodents, active sensing is commonly studied in the whisker system. Rats and mice modulate their whisking contextually, employing both frequency and amplitude modulation. Understanding the development, mechanisms, and plasticity of adaptive motor control will require precise behavioral measurements of whisker position.
Findings

Advances in high-speed videography and analytical methods now permit the collection and systematic analysis of large datasets. Here, we provide 6,642 videos of freely moving juvenile (third to fourth postnatal week) and adult rodents exploring a stationary object in the gap-crossing task. The dataset includes sensory exploration with single or multiple whiskers in wild-type animals, serotonin transporter knockout rats, and rats that received pharmacological interventions targeting serotonergic signaling. The dataset spans varying background illumination conditions and signal-to-noise ratios (SNRs), ranging from homogeneous/high-contrast to non-homogeneous/low-contrast. A subset of videos has been whisker- and nose-tracked and is provided as a reference for image-processing algorithms.
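As an illustration of the contrast/SNR categories described above (not tooling from the dataset itself), one could score individual grayscale frames with simple image statistics. The metric choices here, Michelson contrast and a mean-over-standard-deviation SNR estimate, are assumptions for demonstration only:

```python
import numpy as np

def frame_contrast_and_snr(frame: np.ndarray) -> tuple[float, float]:
    """Return (Michelson contrast, crude SNR) for one grayscale frame.

    These metrics are illustrative choices, not the ones used to
    annotate the published videos.
    """
    f = frame.astype(np.float64)
    fmin, fmax = f.min(), f.max()
    # Michelson contrast: (Imax - Imin) / (Imax + Imin)
    contrast = (fmax - fmin) / (fmax + fmin) if (fmax + fmin) > 0 else 0.0
    # Crude SNR estimate: mean intensity over its standard deviation
    snr = f.mean() / f.std() if f.std() > 0 else float("inf")
    return contrast, snr

# Synthetic high-contrast example: bright object on a dark background
frame = np.full((120, 160), 10.0)
frame[40:80, 60:100] = 240.0
contrast, snr = frame_contrast_and_snr(frame)
```

Running such a score over all frames of a video would give one way to bin recordings into the homogeneous/high-contrast versus non-homogeneous/low-contrast conditions mentioned above.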
Conclusions

The recorded behavioral data can be directly used to study the development of sensorimotor computation, the top-down mechanisms that control sensory navigation and whisker position, and cross-species comparisons of active sensing. They could also help to address the contextual modulation of active sensing during touch-induced whisking in head-fixed vs. freely behaving animals. Finally, they provide the necessary data for machine-learning approaches to the automated analysis of sensory and motion parameters across a wide variety of signal-to-noise ratios, with accompanying human-observer-determined ground truth.