Poster

Supervised learning sets benchmark for robust spike rate inference from calcium imaging signals

Citation

Bethge, M., Theis, L., Berens, P., Froudarakis, E., Reimer, J., Roman-Roson, M., et al. (2016). Supervised learning sets benchmark for robust spike rate inference from calcium imaging signals. Poster presented at Computational and Systems Neuroscience Meeting (COSYNE 2016), Salt Lake City, UT, USA.


Cite as: https://hdl.handle.net/21.11116/0000-0000-7BD1-A
Abstract
A fundamental challenge in calcium imaging is to infer the spike rates of neurons from noisy measured calcium fluorescence traces. We collected a large benchmark dataset (>100,000 spikes, 73 neurons) recorded from two neural tissues (V1 and retina) using different calcium indicators (OGB-1 and GCaMP6s). We introduce a new algorithm based on supervised learning in flexible probabilistic models and systematically compare it against a range of previously published spike inference algorithms. Our new supervised algorithm outperforms all previously published techniques. Importantly, it also outperforms other algorithms when applied to entirely new datasets for which no simultaneously recorded ground truth is available. Future data acquired under new experimental conditions can easily be used to further improve its spike prediction accuracy and generalization performance. Finally, we show that comparing algorithms on artificial data is not informative about performance on real data, suggesting that benchmark datasets such as the one we provide may greatly facilitate future algorithmic development.
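The core idea of supervised spike inference can be illustrated with a minimal sketch: generate a synthetic fluorescence trace by convolving spikes with an indicator kernel, then train a regressor to map a window of fluorescence samples back to the underlying spike rate. The data, kernel time constant, and the choice of ridge regression here are illustrative assumptions; the poster's actual method uses flexible probabilistic models trained on real simultaneously recorded data.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic stand-in for a benchmark recording (hypothetical data) ---
# True spikes and a noisy "calcium fluorescence" trace generated by
# convolving the spikes with an exponential indicator kernel.
T = 2000
spikes = rng.poisson(0.1, size=T).astype(float)
kernel = np.exp(-np.arange(50) / 10.0)      # assumed indicator decay
fluor = np.convolve(spikes, kernel)[:T]
fluor += 0.2 * rng.normal(size=T)           # measurement noise

# --- Supervised inference: predict spike rate from a fluorescence window ---
W = 20  # window half-width in samples

def windows(x, half):
    """Stack sliding windows of x, zero-padded at the edges."""
    xp = np.pad(x, half)
    return np.stack([xp[i:i + 2 * half + 1] for i in range(len(x))])

X = windows(fluor, W)
train, test = slice(0, 1500), slice(1500, T)

# Ridge regression in closed form -- a simple stand-in for the flexible
# probabilistic models described in the abstract.
lam = 1.0
A = X[train].T @ X[train] + lam * np.eye(X.shape[1])
w = np.linalg.solve(A, X[train].T @ spikes[train])
pred = np.clip(X[test] @ w, 0, None)        # spike rates are non-negative

# Correlation between predicted and true spike trains is a common
# evaluation metric for spike inference.
corr = np.corrcoef(pred, spikes[test])[0, 1]
print(f"test correlation: {corr:.2f}")
```

Because the model is fit on labeled (fluorescence, spikes) pairs rather than an assumed generative model, adding new ground-truth recordings from other indicators or tissues directly improves it, which is the generalization property the abstract emphasizes.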