Article

Feature Selection for Hidden Markov Models and Hidden Semi-Markov Models

Journal

IEEE ACCESS
Volume 4, Issue -, Pages 1642-1657

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2016.2552478

Keywords

Feature selection; hidden Markov models; hidden semi-Markov models; maximum a posteriori estimation

Abstract

In this paper, a joint feature selection and parameter estimation algorithm is presented for hidden Markov models (HMMs) and hidden semi-Markov models (HSMMs). New parameters, called feature saliencies, are introduced into the model and used to select features that distinguish between states. A feature saliency represents the probability that a feature is relevant, distinguishing between state-dependent and state-independent distributions. An expectation-maximization (EM) algorithm is used to compute maximum a posteriori (MAP) estimates of the model parameters. An exponential prior on the feature saliencies is compared with a beta prior; these priors can be used to incorporate cost into the model estimation and feature selection process. The algorithm is tested against maximum likelihood estimates and a variational Bayesian method. For the HMM, four formulations are compared on a synthetic data set generated by models with known parameters, a tool wear data set, and data collected during a painting process. For the HSMM, two formulations, maximum likelihood and MAP, are tested on the latter two data sets, demonstrating that the feature saliency method of feature selection extends to semi-Markov processes. The literature on feature selection specifically for HMMs is sparse, and non-existent for HSMMs. This paper fills a gap in the literature concerning simultaneous feature selection and parameter estimation for HMMs using the EM algorithm, and introduces the notion of selecting features with respect to cost for HMMs.
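The feature-saliency idea described in the abstract can be sketched as an emission density in which each feature is a two-component mixture: with probability equal to its saliency the feature follows a state-dependent distribution, and otherwise a shared state-independent one. The snippet below is a minimal illustration of that construction with Gaussian components; the function name, the specific parameter values, and the choice of Gaussians for both components are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def emission_log_likelihood(x, rho, mu, sigma, eps, tau):
    """Log-likelihood of one observation x under each hidden state.

    x:     (D,) observation vector
    rho:   (D,) feature saliencies, each in [0, 1]
    mu:    (K, D) state-dependent means
    sigma: (K, D) state-dependent standard deviations
    eps:   (D,) state-independent means
    tau:   (D,) state-independent standard deviations
    Returns a (K,) array of log p(x | state k).
    """
    def normal_pdf(v, m, s):
        return np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

    # Per-feature mixture: rho_d * N(x_d; mu_{k,d}, sigma_{k,d})
    #                    + (1 - rho_d) * N(x_d; eps_d, tau_d)
    relevant = rho * normal_pdf(x[None, :], mu, sigma)        # shape (K, D)
    irrelevant = (1.0 - rho) * normal_pdf(x[None, :], eps, tau)  # broadcast to (K, D)
    return np.log(relevant + irrelevant).sum(axis=1)          # shape (K,)

# Illustrative parameters: 2 states, 3 features; feature 2 has low saliency,
# so it contributes almost nothing to discriminating between the states.
rho = np.array([0.95, 0.90, 0.05])
mu = np.array([[0.0, 0.0, 0.0],
               [3.0, 3.0, 0.0]])
sigma = np.ones((2, 3))
eps = np.zeros(3)
tau = np.ones(3)

x = np.array([2.9, 3.1, 0.2])
ll = emission_log_likelihood(x, rho, mu, sigma, eps, tau)
print(ll)  # state 1 scores higher, since x lies near its state-dependent means
```

In the EM algorithm of the paper, the saliencies `rho` are estimated jointly with the other parameters (under an exponential or beta prior for the MAP variants); here they are fixed only to show how the emission density separates relevant from irrelevant features.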
