
Released

Poster

A flow-based latent state generative model of neural population responses to natural images

MPS-Authors

Jagadish, AK
Research Group Computational Principles of Intelligence, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

Bashiri, M., Walker, E., Lurz, K.-K., Jagadish, A., Muhammad, T., Ding, Z., et al. (2021). A flow-based latent state generative model of neural population responses to natural images. Poster presented at Bernstein Conference 2021. doi:10.12751/nncn.bc2021.p047.


Cite as: https://hdl.handle.net/21.11116/0000-0009-2A06-2
Abstract
Characterizing the activity of sensory neurons is a major goal of neural system identification. While neural responses in the visual cortex vary with visual stimuli, they also exhibit significant variability even to repeated presentations of identical stimuli. This stimulus-conditioned variability shows significant and structured correlations, commonly referred to as noise correlations, and depends on various factors such as the stimulus, the behavioral task, attention, and the general brain state. Understanding the nature of this correlated variability and its functional implications for the processing of sensory stimuli requires models that account for both stimulus-driven and shared stimulus-conditioned variability. However, existing models for these two major components of neural variability have been developed largely independently. Here, we close this gap and present a joint deep neural system identification model that accounts for both stimulus-driven and shared stimulus-conditioned variability. To this end, we combine (1) state-of-the-art deep networks for stimulus-driven activity and (2) a flexible, normalizing-flow-based generative model to capture the stimulus-conditioned variability. We trained the model end-to-end on the activity of thousands of neurons from multiple areas of the mouse visual cortex in response to thousands of natural images. We show that our model outperforms previous state-of-the-art models in predicting the distribution of neuronal population responses to individual natural images, including changes in the form of the population response distribution as a function of the stimulus. Furthermore, it learns interpretable latent factors of the population response, including factors that capture behavioral variables such as pupil dilation, and other factors that vary systematically with brain area or retinotopic location.
Overall, our model accurately accounts for two critical sources of neural variability while avoiding several complexities associated with many existing latent variable models. It thus provides a useful tool for uncovering the interplay between different factors that contribute to variability in neural activity.
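To make the abstract's architecture concrete, the sketch below shows the core idea of evaluating a flow-based likelihood conditioned on stimulus-driven predictions: a (stand-in) stimulus network produces per-neuron means, and a single invertible affine layer maps responses to a standard-normal base space, contributing a log-determinant term to the log-likelihood. This is a minimal illustration under assumed components (the toy linear "network", the single-layer flow, and all parameter values are hypothetical), not the authors' actual model, which stacks many flow layers and a deep convolutional core.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_images = 5, 100

# Hypothetical stand-in for a stimulus-driven deep network: a fixed linear
# readout of random image features, passed through exp() to give positive,
# firing-rate-like means (assumption for illustration only).
features = rng.normal(size=(n_images, 8))
readout = rng.normal(size=(8, n_neurons))
stim_mean = np.exp(0.1 * (features @ readout))

# One affine "flow" layer mapping responses r to latents z:
#   z = (r - mean(stimulus)) / scale
# with a standard-normal base density on z. A real normalizing flow stacks
# many such invertible layers; one layer keeps the sketch short.
scale = np.full(n_neurons, 0.5)

def log_likelihood(r, mean):
    z = (r - mean) / scale                         # inverse flow transform
    base_logp = -0.5 * (z ** 2 + np.log(2 * np.pi))  # standard-normal base
    log_det = -np.log(scale)                       # log |dz/dr| per neuron
    return (base_logp + log_det).sum(axis=-1)      # one value per image

# Simulated responses: stimulus-driven mean plus a shared latent source of
# noise, mimicking correlated stimulus-conditioned variability.
shared = rng.normal(size=(n_images, 1))
responses = stim_mean + 0.4 * shared + 0.1 * rng.normal(size=(n_images, n_neurons))

ll = log_likelihood(responses, stim_mean)
print(ll.shape)  # one log-likelihood per image
```

In the full model, the flow parameters and the stimulus network are trained jointly by maximizing this log-likelihood over all recorded images, so the flow absorbs the shared variability that the stimulus-driven mean cannot explain.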