  Standardizing and benchmarking data analysis for calcium imaging

Berens, P., Theis, L., Stone, J., Sofroniew, N., Tolias, A., Bethge, M., & Freeman, J. (2017). Standardizing and benchmarking data analysis for calcium imaging. Poster presented at the Computational and Systems Neuroscience Meeting (COSYNE 2017), Salt Lake City, UT, USA.

Item Permalink: http://hdl.handle.net/21.11116/0000-0000-C50B-6
Version Permalink: http://hdl.handle.net/21.11116/0000-0001-18B8-5
Genre: Poster

Files


Locators

Locator: Link (Any fulltext)
Description: -

Creators

Creators:
Berens, P.¹, Author
Theis, L., Author
Stone, J., Author
Sofroniew, N., Author
Tolias, A., Author
Bethge, M.¹, Author
Freeman, J., Author
Affiliations:
¹ University of Tübingen

Content

Free keywords: -
Abstract: Two-photon laser scanning microscopy with fluorescent calcium indicators is used widely to measure the activity of large populations of neurons. Extracting biologically relevant signals of interest without manual intervention remains a challenge. Two key problems are identifying image regions corresponding to individual neurons, and then detecting the timing of individual spikes from their derived fluorescence traces. The neuroscience community still lacks automated and agreed-upon solutions to these problems. Motivated by algorithm benchmarking efforts in computer vision and machine learning, we built two web-based benchmarking systems, Neurofinder (http://neurofinder.codeneuro.org) and Spikefinder (http://spikefinder.codeneuro.org), to compare algorithm performance on standardized datasets. Both were built with modular and modern open-source tools, allowing easy reuse for other data analysis problems. Neurofinder considers the problem of identifying neuron somata in fluorescence movies. We assembled a collection of training datasets from multiple labs in a standardized format, each with labeled regions defined manually, in some cases guided by activity-independent anatomical markers. Algorithm results are submitted through a web application and evaluated on independent test data, for which labels have not been made public. Evaluation metrics separately assess accuracy of neuronal locations and shapes. Submitted results are stored in a database and metrics are presented in a leaderboard. Spikefinder considers the problem of detecting spike times from fluorescence traces, building on a recent quantitative comparison of existing spike inference algorithms (Theis et al. 2016). Here, we assembled training data with simultaneously measured calcium traces and electrophysiologically recorded action potentials. Performance of submitted algorithms is evaluated on a test dataset using several metrics including correlation, information gain, and standard measures from signal detection. Both challenges are currently running with publicly contributed algorithms. We hope this approach will both improve our understanding of how current algorithms perform, and generate new crowd-sourced solutions to current and future analysis problems.
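One of the metrics named above is correlation between the inferred spike trace and the recorded ground truth. As a minimal illustrative sketch (the function name, binning assumption, and constant-trace handling are assumptions for illustration, not the challenge's actual scoring code), a Pearson-correlation score over equally binned traces might look like:

```python
import numpy as np

def correlation_score(inferred, ground_truth):
    """Pearson correlation between an inferred spike-rate trace and a
    ground-truth spike train, assumed binned at the same resolution.

    Returns 0.0 when either trace is constant, where the correlation
    coefficient would otherwise be undefined (division by zero std).
    """
    inferred = np.asarray(inferred, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    if inferred.std() == 0.0 or ground_truth.std() == 0.0:
        return 0.0
    # np.corrcoef returns the 2x2 correlation matrix; the off-diagonal
    # entry is the Pearson correlation between the two traces.
    return float(np.corrcoef(inferred, ground_truth)[0, 1])
```

A leaderboard built on such a score would rank a submission higher the more its inferred trace co-varies with the true spike train, independent of its overall scale.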

Details

Language(s): -
Dates: 2017-02
Publication Status: Published in print
Pages: -
Publishing info: -
Table of Contents: -
Rev. Method: -
Identifiers: BibTeX Citekey: BerensTSSTBF2017
Degree: -

Event

Title: Computational and Systems Neuroscience Meeting (COSYNE 2017)
Place of Event: Salt Lake City, UT, USA
Start-/End Date: -

Source 1

Title: Computational and Systems Neuroscience Meeting (COSYNE 2017)
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: -
Sequence Number: I-32
Start / End Page: 66 - 67
Identifier: -