
Released

Poster

Amortised inference for mechanistic models of neural dynamics

MPS-Authors

Lueckmann, J.-M.
Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Max Planck Society;


Gonçalves, Pedro J.
Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Max Planck Society;


Bassetto, Giacomo
Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Max Planck Society;


Macke, Jakob H.
Max Planck Research Group Neural Systems Analysis, Center of Advanced European Studies and Research (caesar), Max Planck Society;
External Organizations;

Citation

Lueckmann, J.-M., Gonçalves, P. J., Chintaluri, C., Podlaski, W. F., Bassetto, G., Vogels, T. P., et al. (2019). Amortised inference for mechanistic models of neural dynamics. Poster presented at Computational and Systems Neuroscience (Cosyne) 2019, Lisbon, Portugal.


Cite as: https://hdl.handle.net/21.11116/0000-0006-8ED8-7
Abstract
Bayesian statistical inference provides a principled framework for linking mechanistic models of neural dynamics with empirical measurements. However, for many models of interest, in particular those relying on numerical simulations, statistical inference is difficult and requires bespoke and expensive inference algorithms. Furthermore, even within the same model class, each new measurement requires a full new inference: one cannot leverage knowledge from past inferences to facilitate new ones. This limits the use of Bayesian inference in time-critical, large-scale, or fully-automated applications.
We overcome these limitations by presenting a method for statistical inference on simulation-based models which can be applied in a 'black-box' manner to a wide range of models in neuroscience. The key idea is to generate a large number of simulations from the model of interest and use them to train a neural network to perform statistical inference. Once the network is trained, performing inference given any observed data is very fast, requiring only a single forward pass through the network, i.e., inference is amortised.
We explain how our approach can be used to perform parameter estimation, and illustrate it in the context of
ion channel models. We train a network on a large diversity of simulated current responses to voltage-clamp
protocols. After training, the network is able to instantaneously provide the posterior distribution over the channel
model parameters given current responses from a publicly available database of ion channel models. The approach will enable neuroscientists to perform scalable Bayesian inference on large-scale data sets and complex
models without having to design model-specific algorithms, closing the gap between mechanistic and statistical approaches to neural dynamics.
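The workflow described in the abstract (simulate from the prior, train a network on the simulations, then amortise inference as a single forward pass) can be sketched with a toy stand-in. This is not the authors' method: a closed-form least-squares regressor replaces the neural network, and a hypothetical two-observation linear simulator replaces the ion-channel model; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta):
    # Hypothetical toy simulator standing in for a mechanistic model:
    # two noisy linear readouts of a single parameter per row of theta.
    return theta * np.array([1.0, 2.0]) + rng.normal(0.0, 0.1, size=(len(theta), 2))

# Step 1: generate a large number of simulations from the prior.
theta_train = rng.uniform(-1.0, 1.0, size=(10_000, 1))
x_train = simulate(theta_train)

# Step 2: "train the network" on (parameter, simulation) pairs.
# A linear least-squares fit stands in for neural-network training.
X = np.hstack([x_train, np.ones((len(x_train), 1))])  # add a bias column
w, *_ = np.linalg.lstsq(X, theta_train, rcond=None)

# Step 3: amortised inference is now a single forward pass per observation,
# with no retraining needed for new data.
def infer(x_obs):
    return np.hstack([x_obs, np.ones((len(x_obs), 1))]) @ w

x_obs = simulate(np.array([[0.5]]))
estimate = float(infer(x_obs))  # estimate is close to the true value 0.5
```

The same trained `w` serves every subsequent observation, which is the sense in which the cost of inference is amortised over the up-front simulation and training effort.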