  Amortized inference for fast spike prediction from calcium imaging data

Speiser, A., Turaga, S., Archer, E., & Macke, J. (2017). Amortized inference for fast spike prediction from calcium imaging data. Poster presented at Computational and Systems Neuroscience Meeting (COSYNE 2017), Salt Lake City, UT, USA.

Creators

Speiser, A.¹, Author
Turaga, S., Author
Archer, E., Author
Macke, J.¹, Author

Affiliations:
¹ Center of Advanced European Studies and Research (caesar), Max Planck Society, ou_2173675

Content

Abstract: Calcium imaging allows neuronal activity to be measured from large populations of spatially identified neurons in vivo. However, spike inference algorithms are needed to infer spike times from fluorescence measurements of calcium concentration. Bayesian model inversion can be used to infer spikes, using carefully designed generative models that describe how spiking activity in a neuron influences the measured fluorescence. Model inversion typically requires either computationally expensive MCMC sampling methods or faster but approximate maximum a posteriori estimation.

We present a method for efficiently inverting generative models for spike inference. Our method is several orders of magnitude faster than existing approaches, allowing generative-model-based spike inference in real time for large-scale population neural imaging, and it can be applied to a wide range of linear and nonlinear generative models. We use recent advances in black-box variational inference (BBVI; Ranganath 2014) and 'amortize' inference by learning a deep-network-based recognition model for fast model inversion (Mnih 2016). At training time, we simultaneously optimize the parameters of the generative model and the weights of a deep neural network that predicts the posterior approximation. At test time, performing inference for a given trace amounts to a single fast forward pass through the network, at constant computational cost and without the need for iterative optimization or MCMC sampling.

On simple synthetic datasets, we show that our method is just as accurate as existing methods. Moreover, the BBVI approach works with a wide range of generative models in a black-box manner, as long as they are differentiable. In particular, we show that a nonlinear generative model is better suited to describe GCaMP6 data (Chen 2013), leading to improved performance on real data. The framework can also easily be extended to combine supervised and unsupervised objectives, enabling semi-supervised learning of spike inference.
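The amortized setup described in the abstract can be illustrated with a minimal sketch: a toy generative model (spikes convolved with an exponential calcium kernel plus noise) and a small recognition network that maps a window of the fluorescence trace to a posterior spike probability in one forward pass. All parameters, window sizes, and network shapes below are hypothetical stand-ins; in the actual method the recognition-network weights are trained jointly with the generative model by maximizing a variational (ELBO-style) objective rather than drawn at random.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy generative model (illustrative parameters only) ---
# Binary spikes drive a calcium transient with exponential decay (tau),
# observed as a noisy fluorescence trace.
T, tau, sigma = 200, 10.0, 0.1
spikes = (rng.random(T) < 0.05).astype(float)
kernel = np.exp(-np.arange(50) / tau)          # calcium impulse response
calcium = np.convolve(spikes, kernel)[:T]
trace = calcium + sigma * rng.standard_normal(T)

# --- Amortized recognition model: one forward pass, constant cost ---
# A tiny MLP maps a window of the trace around each time bin to the
# approximate posterior spike probability at that bin. Random weights
# here stand in for weights learned during training.
W_hid = rng.standard_normal((16, 21)) * 0.1
W_out = rng.standard_normal(16) * 0.1

def recognition_forward(y, w=10):
    """Predict q(spike_t = 1 | trace) for every t in a single pass."""
    padded = np.pad(y, w)
    # One (2w+1)-sample window per time bin, shape (T, 2w+1).
    windows = np.stack([padded[t:t + 2 * w + 1] for t in range(len(y))])
    hidden = np.tanh(windows @ W_hid.T)
    return 1.0 / (1.0 + np.exp(-(hidden @ W_out)))   # sigmoid output

q = recognition_forward(trace)
print(q.shape)   # one posterior spike probability per time bin
```

Because inference is a single network evaluation, its cost does not grow with the number of optimization or sampling iterations, which is the source of the speedup the abstract reports over MCMC and iterative MAP approaches.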

Details

Dates: 2017-02
Publication Status: Published in print
Identifiers: BibTeX Citekey: SpeiserTAM2017

Event

Title: Computational and Systems Neuroscience Meeting (COSYNE 2017)
Place of Event: Salt Lake City, UT, USA


Source 1

Title: Computational and Systems Neuroscience Meeting (COSYNE 2017)
Source Genre: Proceedings
Sequence Number: III-59
Start / End Page: 207 - 208