Journal: Artificial Intelligence
Volume 151, Issues 1-2, Pages 155-176
Publisher: Elsevier Science BV
DOI: 10.1016/S0004-3702(03)00079-1
Keywords
classification; feature selection; evaluation measures; search strategies; random search; branch and bound
Feature selection is an effective technique for dimensionality reduction. In classification, it is used to find an optimal subset of relevant features such that overall classification accuracy is increased while the data size is reduced and comprehensibility is improved. Feature selection methods involve two important aspects: evaluation of a candidate feature subset and search through the feature space. Existing algorithms adopt various measures to evaluate the goodness of feature subsets. This work focuses on the inconsistency measure, according to which a feature subset is inconsistent if there exist at least two instances with the same feature values but different class labels. We compare the inconsistency measure with other measures and study different search strategies, such as exhaustive, complete, heuristic, and random search, that can be applied to this measure. We conduct an empirical study to examine the pros and cons of these search methods, give some guidelines on choosing a search method, and compare classifier error rates before and after feature selection. (C) 2003 Elsevier B.V. All rights reserved.
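The abstract defines a subset as inconsistent when two instances agree on all selected features but carry different class labels. A common way to turn this into a numeric score (a sketch, not the paper's exact formulation) is the inconsistency rate: group instances by their values on the selected features, charge each group its size minus the count of its majority class, and divide by the total number of instances. The function and data below are illustrative.

```python
from collections import Counter, defaultdict

def inconsistency_rate(instances, labels, subset):
    """Inconsistency rate of a feature subset.

    Instances are grouped by their values on the selected features;
    each group contributes (group size - majority-class count), and
    the total is normalized by the number of instances.
    """
    groups = defaultdict(list)
    for row, label in zip(instances, labels):
        key = tuple(row[i] for i in subset)
        groups[key].append(label)
    inconsistent = sum(len(g) - Counter(g).most_common(1)[0][1]
                       for g in groups.values())
    return inconsistent / len(instances)

# Toy data: the first two instances share feature 0's value but have
# different labels, so selecting only feature 0 is inconsistent.
data = [(1, 0), (1, 1), (0, 0), (0, 1)]
labels = ["a", "b", "a", "a"]
print(inconsistency_rate(data, labels, [0]))     # 0.25
print(inconsistency_rate(data, labels, [0, 1]))  # 0.0
```

A search strategy (exhaustive, heuristic, or random, as the abstract lists) would use such a score to compare candidate subsets, typically preferring the smallest subset whose inconsistency rate does not exceed that of the full feature set.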