Journal Article

Multiplexed computations in retinal ganglion cells of a single type


Deny, S., Ferrari, U., Macé, E., Yger, P., Caplette, R., Picaud, S., et al. (2017). Multiplexed computations in retinal ganglion cells of a single type. Nature Communications, 8: 1964. doi:10.1038/s41467-017-02159-y.

Cite as: https://hdl.handle.net/21.11116/0000-0009-B0F2-E
In the early visual system, cells of the same type perform the same computation at different locations in the visual field. How these cells together encode a complex visual scene is unclear. A common assumption is that cells of a single type extract a single stimulus feature to form a feature map, but this has rarely been observed directly. Using large-scale recordings in the rat retina, we show that a homogeneous population of fast OFF ganglion cells simultaneously encodes two radically different features of a visual scene. Cells close to a moving object code quasilinearly for its position, while distant cells remain largely invariant to the object's position and instead respond nonlinearly to changes in its speed. We develop a quantitative model that accounts for this effect and identify a disinhibitory circuit that mediates it. Ganglion cells of a single type thus code not for one but for two features simultaneously. This richer, more flexible neural map might also be present in other sensory systems.