Journal
ARTIFICIAL INTELLIGENCE
Volume 151, Issues 1-2, Pages 155-176
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/S0004-3702(03)00079-1
Keywords
classification; feature selection; evaluation measures; search strategies; random search; branch and bound
Feature selection is an effective technique for dimensionality reduction. In classification, it is used to find an optimal subset of relevant features such that overall classification accuracy is increased while the data size is reduced and comprehensibility is improved. Feature selection methods involve two important aspects: evaluation of a candidate feature subset and search through the feature space. Existing algorithms adopt various measures to evaluate the goodness of feature subsets. This work focuses on the inconsistency measure, according to which a feature subset is inconsistent if there exist at least two instances with the same feature values but different class labels. We compare the inconsistency measure with other measures and study different search strategies, such as exhaustive, complete, heuristic, and random search, that can be applied to this measure. We conduct an empirical study to examine the pros and cons of these search methods, give some guidelines on choosing a search method, and compare classifier error rates before and after feature selection. (C) 2003 Elsevier B.V. All rights reserved.
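The inconsistency measure described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: the standard formulation counts, for each group of instances that agree on the selected features, the instances outside the group's majority class, and normalizes by the total number of instances. The function name and data layout are assumptions for illustration.

```python
from collections import Counter, defaultdict

def inconsistency_rate(instances, labels, subset):
    """Inconsistency rate of a feature subset: instances that agree on
    the selected features but carry different class labels are
    inconsistent. For each matching group, the inconsistency count is
    the group size minus the size of its majority class."""
    groups = defaultdict(list)
    for row, label in zip(instances, labels):
        key = tuple(row[i] for i in subset)  # project onto the subset
        groups[key].append(label)
    inconsistent = sum(len(g) - Counter(g).most_common(1)[0][1]
                       for g in groups.values())
    return inconsistent / len(instances)

# Toy data: features 0 and 1 alone cannot separate the classes,
# but adding feature 2 makes the data fully consistent.
X = [(0, 1, 0), (0, 1, 1), (1, 0, 0), (1, 0, 1)]
y = ["a", "b", "a", "b"]
print(inconsistency_rate(X, y, [0, 1]))     # 0.5
print(inconsistency_rate(X, y, [0, 1, 2]))  # 0.0
```

A search strategy (exhaustive, heuristic, or random) would then look for the smallest subset whose inconsistency rate does not exceed that of the full feature set.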