  Learning stable, regularised latent models of neural population dynamics

Büsing, L., Macke, J. H., & Sahani, M. (2012). Learning stable, regularised latent models of neural population dynamics. Network, 23(1-2), 24-47. doi:10.3109/0954898X.2012.677095.

Item Permalink: http://hdl.handle.net/11858/00-001M-0000-0028-64DE-F
Version Permalink: http://hdl.handle.net/11858/00-001M-0000-0028-64DF-D
Genre: Journal Article


Creators

 Creators:
Büsing, L., Author
Macke, J. H.1, Author
Sahani, M., Author
Affiliations:
1 External Organizations, ou_persistent22

Content

Free keywords: Algorithms; Animals; *Artificial Intelligence; Computer Simulation; Data Interpretation, Statistical; Electrodes, Implanted; Likelihood Functions; Linear Models; Macaca mulatta; Models, Neurological; Motor Cortex/physiology; Nerve Net/physiology; *Neural Networks (Computer); Normal Distribution; Population Dynamics; User-Computer Interface
Abstract: Ongoing advances in experimental technique are making commonplace simultaneous recordings of the activity of tens to hundreds of cortical neurons at high temporal resolution. Latent population models, including Gaussian-process factor analysis and hidden linear dynamical system (LDS) models, have proven effective at capturing the statistical structure of such data sets. They can be estimated efficiently, yield useful visualisations of population activity, and are also integral building-blocks of decoding algorithms for brain-machine interfaces (BMI). One practical challenge, particularly to LDS models, is that when parameters are learned using realistic volumes of data, the resulting models often fail to reflect the true temporal continuity of the dynamics; indeed, they may describe a biologically implausible, unstable population dynamic, that is, one that predicts neural activity growing without bound. We propose a method for learning LDS models based on expectation maximisation that constrains parameters to yield stable systems and at the same time promotes capture of temporal structure by appropriate regularisation. We show that when only little training data is available, our method yields LDS parameter estimates which provide a substantially better statistical description of the data than alternatives, whilst guaranteeing stable dynamics. We demonstrate our methods using both synthetic data and extracellular multi-electrode recordings from motor cortex.
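The instability problem described in the abstract can be illustrated with a short, self-contained sketch. This is not the paper's constrained-EM algorithm: the dynamics matrix values and the naive eigenvalue-rescaling "fix" below are illustrative assumptions, chosen only to show how a latent LDS whose dynamics matrix has spectral radius above 1 produces trajectories that grow without bound.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(A, T=200, noise_std=0.1):
    """Simulate latent LDS trajectories x_{t+1} = A x_t + w_t, w_t ~ N(0, noise_std^2 I)."""
    d = A.shape[0]
    x = np.zeros((T, d))
    x[0] = rng.normal(size=d)
    for t in range(T - 1):
        x[t + 1] = A @ x[t] + rng.normal(scale=noise_std, size=d)
    return x

# A hypothetical "estimated" dynamics matrix whose spectral radius slightly
# exceeds 1, as can happen when A is fit from limited data.
A_hat = np.array([[1.02, 0.10],
                  [0.00, 0.95]])
rho = float(max(abs(np.linalg.eigvals(A_hat))))  # spectral radius = 1.02 > 1

# Crude stabilisation for illustration only: rescale A so its spectral radius
# falls below 1. The paper instead enforces stability within EM, with
# regularisation to preserve temporal structure.
A_stable = A_hat * (0.99 / rho)

x_unstable = simulate(A_hat)
x_stable = simulate(A_stable)

print(f"spectral radius of estimate: {rho:.2f}")
print(f"max |x|, unstable: {np.abs(x_unstable).max():.1f}; "
      f"stable: {np.abs(x_stable).max():.1f}")
```

Over 200 time steps the unstable system's latent state is amplified by roughly 1.02^200, so its trajectory dwarfs the stabilised one; this is the "activity that grows without bound" failure mode the paper's constrained estimation is designed to rule out.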

Details

Language(s):
Dates: 2012
Publication Status: Published in print
Pages: -
Publishing info: -
Table of Contents: -
Rev. Method: -
Identifiers: Other: 22663075
DOI: 10.3109/0954898X.2012.677095
ISSN: 1361-6536 (Electronic)
ISSN: 0954-898X (Linking)
Degree: -


Source 1

Title: Network
Alternative Title: Network
Source Genre: Journal
Creator(s): -
Affiliations: -
Publ. Info: -
Pages: -
Volume / Issue: 23 (1-2)
Sequence Number: -
Start / End Page: 24 - 47
Identifier: -