
Journal Article (Released)
Quantitative predictions orchestrate visual signaling in Drosophila

Citation

Kim, A. J., Fenk, L. M., Lyu, C., & Maimon, G. (2017). Quantitative predictions orchestrate visual signaling in Drosophila. Cell, 168(1-2), 280-294. doi:10.1016/j.cell.2016.12.005.


Cite as: https://hdl.handle.net/21.11116/0000-0009-762F-F
Abstract
Vision influences behavior, but ongoing behavior also modulates vision in animals ranging from insects to primates. The function and biophysical mechanisms of most such modulations remain unresolved. Here, we combine behavioral genetics, electrophysiology, and high-speed videography to advance a function for behavioral modulations of visual processing in Drosophila. We argue that a set of motion-sensitive visual neurons regulate gaze-stabilizing head movements. We describe how, during flight turns, Drosophila perform a set of head movements that require silencing their gaze-stability reflexes along the primary rotation axis of the turn. Consistent with this behavioral requirement, we find pervasive motor-related inputs to the visual neurons, which quantitatively silence their predicted visual responses to rotations around the relevant axis while preserving sensitivity around other axes. This work proposes a function for a behavioral modulation of visual processing and illustrates how the brain can remove one sensory signal from a circuit carrying multiple related signals.