Article

A sequential pruning strategy for the selection of the number of states in hidden Markov models

Journal

Pattern Recognition Letters
Volume 24, Issue 9-10, Pages 1395-1407

Publisher

Elsevier
DOI: 10.1016/S0167-8655(02)00380-X

Keywords

hidden Markov models; model selection; Bayesian inference criterion; minimum description length; state pruning


This paper addresses the problem of selecting the optimal structure of a hidden Markov model, namely its number of states. A new approach is proposed that overcomes two drawbacks of standard general-purpose methods, such as those based on the Bayesian inference criterion: their computational cost and their sensitivity to the initialization of the training procedure. The basic idea is decreasing learning: each training session starts from a nearly good configuration, derived from the result of the previous session by pruning the least probable state of the model. Experiments with real and synthetic data show that the proposed approach finds the optimal model more accurately and yields higher classification accuracy, while reducing the computational burden. (C) 2002 Elsevier Science B.V. All rights reserved.
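The decreasing-learning loop described in the abstract can be sketched as follows. This is a minimal illustration under our own assumptions, not the authors' implementation: a discrete-observation HMM is fitted by Baum-Welch, the state with the smallest expected occupancy is pruned, the pruned model seeds the next training session, and a BIC-style score selects the final size. All function names, the parameter count in the penalty, and the toy data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_backward(A, B, pi, obs):
    """Scaled forward-backward pass.

    Returns the log-likelihood, state occupancies gamma (T x K),
    and expected transition counts xi (K x K)."""
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    c = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.zeros((T, K))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta
    xi = np.zeros((K, K))
    for t in range(T - 1):
        xi += np.outer(alpha[t], B[:, obs[t + 1]] * beta[t + 1]) * A / c[t + 1]
    return np.log(c).sum(), gamma, xi

def baum_welch(A, B, pi, obs, iters=25):
    """A few EM iterations; returns the re-estimated model, its
    log-likelihood, and the final state occupancies."""
    for _ in range(iters):
        _, gamma, xi = forward_backward(A, B, pi, obs)
        pi = gamma[0] / gamma[0].sum()
        A = xi / xi.sum(axis=1, keepdims=True)
        Bn = np.array([gamma[obs == m].sum(axis=0) for m in range(B.shape[1])]).T
        B = Bn / Bn.sum(axis=1, keepdims=True)
    ll, gamma, _ = forward_backward(A, B, pi, obs)
    return A, B, pi, ll, gamma

def bic(ll, K, M, T):
    """BIC-style score: penalized log-likelihood (higher is better)."""
    n_free = K * (K - 1) + K * (M - 1) + (K - 1)  # free HMM parameters
    return ll - 0.5 * n_free * np.log(T)

def sequential_pruning(obs, M, K_max=4, K_min=1):
    """Train with K_max states, then repeatedly prune the least probable
    state and retrain from the pruned model, scoring each size."""
    T = len(obs)
    A = rng.random((K_max, K_max)) + 1.0
    A /= A.sum(axis=1, keepdims=True)
    B = rng.random((K_max, M)) + 1.0
    B /= B.sum(axis=1, keepdims=True)
    pi = np.full(K_max, 1.0 / K_max)
    best_score, best_K = -np.inf, K_max
    for K in range(K_max, K_min - 1, -1):
        A, B, pi, ll, gamma = baum_welch(A, B, pi, obs)
        score = bic(ll, K, M, T)
        if score > best_score:
            best_score, best_K = score, K
        if K == K_min:
            break
        # Prune the state with the smallest expected occupancy, then
        # renormalize so the pruned model seeds the next session.
        keep = np.delete(np.arange(K), gamma.sum(axis=0).argmin())
        A = A[np.ix_(keep, keep)]
        A /= A.sum(axis=1, keepdims=True)
        B = B[keep]
        pi = pi[keep] / pi[keep].sum()
    return best_score, best_K

def sample_hmm(A, B, pi, T):
    """Draw an observation sequence from a discrete HMM (toy data)."""
    obs = np.zeros(T, dtype=int)
    s = rng.choice(len(pi), p=pi)
    for t in range(T):
        obs[t] = rng.choice(B.shape[1], p=B[s])
        s = rng.choice(A.shape[0], p=A[s])
    return obs

# Synthetic data from a 2-state, 3-symbol HMM (illustrative parameters).
A_true = np.array([[0.9, 0.1], [0.2, 0.8]])
B_true = np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]])
obs = sample_hmm(A_true, B_true, np.array([0.5, 0.5]), 400)
score, K_star = sequential_pruning(obs, M=3, K_max=4)
print("selected number of states:", K_star)
```

Because each session starts from the previous model with one state removed, the fit begins near a good solution rather than from a fresh random initialization, which is the source of the claimed savings in training cost and the reduced sensitivity to initialization.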

