4.6 Review

Bayesian methods for hidden Markov models: Recursive computing in the 21st century

Publisher

AMER STATISTICAL ASSOC
DOI: 10.1198/016214502753479464

Keywords

forward-backward recursion; Gibbs sampler; Kalman filter; local computation; Markov chain Monte Carlo

Markov chain Monte Carlo (MCMC) sampling strategies can be used to simulate hidden Markov model (HMM) parameters from their posterior distribution given observed data. Some MCMC methods used in practice (for computing likelihoods, conditional probabilities of hidden states, and the most likely sequence of states) can be improved by incorporating established recursive algorithms. The most important of these is a set of forward-backward recursions that calculate the conditional distributions of the hidden states given the observed data and model parameters. I show how to use the recursive algorithms in an MCMC context and present mathematical and empirical results showing that a Gibbs sampler using the forward-backward recursions mixes more rapidly than another sampler often used for HMMs. I introduce an augmented-variables technique for obtaining unique state labels in HMMs and finite mixture models. I show how recursive computing allows the statistically efficient use of MCMC output when estimating the hidden states. I directly calculate the posterior distribution of the size of the hidden chain's state space by MCMC, circumventing the asymptotic arguments underlying the Bayesian information criterion, which is shown to be inappropriate for a frequently analyzed dataset in the HMM literature. The use of the log-likelihood for assessing MCMC convergence is illustrated, and posterior predictive checks are used to investigate application-specific questions of model adequacy.
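The forward-backward recursions named in the abstract can be made concrete with a short sketch. The Python code below is a minimal, hypothetical illustration of forward filtering followed by backward sampling of the hidden state sequence given the observations and fixed model parameters; the discrete-emission setup and the names init, trans, and emis are assumptions made for illustration, not the paper's own code.

import numpy as np

def ffbs(obs, init, trans, emis, rng=None):
    """Forward filtering, backward sampling for a discrete-emission HMM.

    obs   : (T,) int array of observed symbols
    init  : (K,) initial state distribution
    trans : (K, K) transition matrix, trans[i, j] = P(next state j | state i)
    emis  : (K, M) emission matrix,   emis[k, m] = P(symbol m | state k)

    Returns one draw of the hidden state sequence from
    p(states | obs, parameters), the step a Gibbs sampler would use.
    """
    rng = np.random.default_rng() if rng is None else rng
    T, K = len(obs), len(init)

    # Forward pass: filtered probabilities p(state_t | obs_1..obs_t).
    alpha = np.zeros((T, K))
    alpha[0] = init * emis[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ trans) * emis[:, obs[t]]
        alpha[t] /= alpha[t].sum()

    # Backward pass: draw the last state, then each earlier state from
    # p(state_t | state_{t+1}, obs_1..obs_t), proportional to alpha[t] * trans[:, state_{t+1}].
    states = np.zeros(T, dtype=int)
    states[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * trans[:, states[t + 1]]
        states[t] = rng.choice(K, p=w / w.sum())
    return states

Inside a Gibbs sampler for an HMM, a draw of the hidden states from such a routine would typically alternate with draws of the transition and emission parameters from their conditional posteriors (for example, Dirichlet updates under conjugate priors); this is the style of sampler the abstract reports as mixing more rapidly than one that updates states individually.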
