  Flexible statistical inference for mechanistic models of neural dynamics

Lueckmann, J.-M., Gonçalves, P. J., Bassetto, G., Oecal, K., Nonnenmacher, M., & Macke, J. H. (2018). Flexible statistical inference for mechanistic models of neural dynamics. In Advances in Neural Information Processing Systems 30 (NIPS 2017).

Basic
Genre: Conference Paper

Files

Lückmann_Flexible.pdf (Any fulltext), 7MB
Name: Lückmann_Flexible.pdf
Description: -
OA-Status: -
Visibility: Public
MIME-Type / Checksum: application/pdf / [MD5]
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Locators

Description: a poster was additionally presented at the conference (?)
Locator: https://papers.nips.cc/paper/2017 (Supplementary material)

Creators

Creators:
Lueckmann, J.-M.1, Author
Gonçalves, Pedro J.1, Author
Bassetto, Giacomo1, Author
Oecal, Kaan1, Author
Nonnenmacher, Marcel1, Author
Macke, Jakob H.1, Author
Affiliations:
1 Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Max Planck Society, ou_2173683

Content

Free keywords: -
 Abstract: Mechanistic models of single-neuron dynamics have been extensively studied in computational neuroscience. However, identifying which models can quantitatively reproduce empirically measured data has been challenging. We propose to overcome this limitation by using likelihood-free inference approaches (also known as Approximate Bayesian Computation, ABC) to perform full Bayesian inference on single-neuron models. Our approach builds on recent advances in ABC by learning a neural network which maps features of the observed data to the posterior distribution over parameters. We learn a Bayesian mixture-density network approximating the posterior over multiple rounds of adaptively chosen simulations. Furthermore, we propose an efficient approach for handling missing features and parameter settings for which the simulator fails, as well as a strategy for automatically learning relevant features using recurrent neural networks. On synthetic data, our approach efficiently estimates posterior distributions and recovers ground-truth parameters. On in-vitro recordings of membrane voltages, we recover multivariate posteriors over biophysical parameters, which yield model-predicted voltage traces that accurately match empirical data. Our approach will enable neuroscientists to perform Bayesian inference on complex neuron models without having to design model-specific algorithms, closing the gap between mechanistic and statistical approaches to single-neuron modelling.
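The inference loop the abstract describes — draw parameters, simulate, fit a conditional density estimator mapping data features to a posterior, and use that posterior to choose the next round of simulations — can be sketched in toy form. Everything below is an illustrative assumption, not the paper's implementation: the one-parameter simulator and prior are invented, a closed-form linear-Gaussian fit stands in for the Bayesian mixture-density network, and the sketch omits the corrections the actual method applies when simulations come from an adaptive proposal rather than the prior.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # Toy stand-in for a mechanistic neuron model: the "recorded
    # feature" is just the parameter plus observation noise.
    return theta + rng.normal(0.0, 0.5, size=theta.shape)

# Synthetic observed data whose generating parameter we want to infer.
x_obs = np.array([1.6])

# The proposal starts at the prior, N(0, 2^2).
prop_mean, prop_std = 0.0, 2.0

for round_idx in range(3):
    # 1. Draw parameters from the current proposal and simulate.
    theta = rng.normal(prop_mean, prop_std, size=(1000, 1))
    x = simulator(theta)

    # 2. Fit a conditional density estimator q(theta | x).
    #    A linear-Gaussian regression of theta on x stands in for
    #    the paper's mixture-density network.
    design = np.hstack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(design, theta, rcond=None)
    resid_std = float((theta - design @ coef).std())

    # 3. Evaluate the estimator at x_obs and reuse it as the next
    #    proposal, so later rounds simulate near the observed data.
    prop_mean = (np.hstack([x_obs, [1.0]]) @ coef).item()
    prop_std = resid_std

print(f"approximate posterior: mean={prop_mean:.2f}, std={prop_std:.2f}")
```

Over the rounds the proposal contracts around the parameter consistent with `x_obs`, which is the adaptive-simulation idea; the full method additionally handles multimodal posteriors, missing features, and failed simulations.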

Details

Language(s): eng - English
Dates: 2018-01-01
Publication Status: Published online
Pages: 11
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: -
Degree: -

Event

Title: Neural Information Processing Systems (NIPS 2017)
Place of Event: Long Beach, CA, USA
Start-/End Date: 2017-12-04 - 2017-12-09


Source 1

Title: Advances in Neural Information Processing Systems 30 (NIPS 2017)
Source Genre: Proceedings
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: 30
Sequence Number: -
Start / End Page: -
Identifier: -