Article

Approximate Methods for State-Space Models

Journal

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
Volume 105, Issue 489, Pages 170-180

Publisher

AMER STATISTICAL ASSOC
DOI: 10.1198/jasa.2009.tm08326

Keywords

Laplace's method; Neural decoding; Recursive Bayesian estimation

Funding

  1. [1201 MH064537]
  2. [R01 EB005847]
  3. [R01 NS050256]

State-space models provide an important body of techniques for analyzing time series, but their use requires estimating unobserved states. The optimal estimate of the state is its conditional expectation given the observation histories, and computing this expectation is hard when there are nonlinearities. Existing filtering methods, including sequential Monte Carlo, tend to be either inaccurate or slow. In this paper, we study a nonlinear filter for nonlinear/non-Gaussian state-space models, which uses Laplace's method, an asymptotic series expansion, to approximate the state's conditional mean and variance, together with a Gaussian conditional distribution. This Laplace Gaussian filter (LGF) gives fast, recursive, deterministic state estimates, with an error which is set by the stochastic characteristics of the model and is, we show, stable over time. We illustrate the estimation ability of the LGF by applying it to the problem of neural decoding and compare it to sequential Monte Carlo both in simulations and with real data. We find that the LGF can deliver superior results in a small fraction of the computing time. This article has supplementary material online.
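
The abstract describes the LGF recursion only at a high level. Below is a minimal sketch of one predict/update step, assuming linear Gaussian state dynamics and Poisson observations with log-linear rates (a common neural-decoding setup). The model choice, parameter names, and the lgf_step helper are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a first-order Laplace Gaussian filter (LGF) step.
# Assumed model (illustrative): x_t = A x_{t-1} + noise with covariance Q,
# and Poisson spike counts y_t with rates lambda_i = exp(c_i . x_t) * dt.
import numpy as np
from scipy.optimize import minimize

def lgf_step(mean_prev, cov_prev, y_t, A, Q, C, dt=1.0):
    """One predict/update step of the Laplace Gaussian filter."""
    # Predict: propagate the previous Gaussian through the linear dynamics.
    mean_pred = A @ mean_prev
    cov_pred = A @ cov_prev @ A.T + Q
    cov_pred_inv = np.linalg.inv(cov_pred)

    # Negative log posterior (up to an additive constant):
    # -log p(y_t | x) - log N(x; mean_pred, cov_pred).
    def neg_log_post(x):
        rates = np.exp(C @ x) * dt
        loglik = np.sum(y_t * np.log(rates) - rates)
        diff = x - mean_pred
        logprior = -0.5 * diff @ cov_pred_inv @ diff
        return -(loglik + logprior)

    # Laplace's method: the posterior mode becomes the filtered mean ...
    mode = minimize(neg_log_post, mean_pred, method="BFGS").x

    # ... and the inverse Hessian of the negative log posterior at the mode
    # becomes the filtered covariance (analytic for the Poisson/log-linear case).
    rates = np.exp(C @ mode) * dt
    hess = C.T @ (rates[:, None] * C) + cov_pred_inv
    return mode, np.linalg.inv(hess)
```

Iterating lgf_step over the observation sequence yields the recursive, deterministic state estimates described above; no sampling is involved, which is where the speed advantage over sequential Monte Carlo comes from.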

