Article

Positive approximation: An accelerator for attribute reduction in rough set theory

Journal

ARTIFICIAL INTELLIGENCE
Volume 174, Issues 9-10, Pages 597-618

Publisher

ELSEVIER
DOI: 10.1016/j.artint.2010.04.018

Keywords

Rough set theory; Attribute reduction; Decision table; Positive approximation; Granular computing

Funding

  1. National Natural Science Foundation of China [60773133, 60903110, 70971080]
  2. National Key Basic Research and Development Program of China (973) [2007CB311002]
  3. Government of Hong Kong SAR [GRF: CityU 113308]
  4. National High Technology Research and Development Program of China [2007AA01Z165]
  5. Natural Science Foundation of Shanxi Province, China [2008011038, 2009021017-1]

Abstract

Feature selection is a challenging problem in areas such as pattern recognition, machine learning and data mining. Considering a consistency measure introduced in rough set theory, the problem of feature selection, also called attribute reduction, aims to retain the discriminatory power of the original features. Many heuristic attribute reduction algorithms have been proposed; however, these methods are often computationally time-consuming. To overcome this shortcoming, we introduce a theoretic framework based on rough set theory, called the positive approximation, which can be used to accelerate a heuristic process of attribute reduction. Based on the proposed accelerator, a general attribute reduction algorithm is designed. Through the use of the accelerator, several representative heuristic attribute reduction algorithms in rough set theory have been enhanced. Note that each of the modified algorithms chooses the same attribute reduct as its original version, and hence possesses the same classification accuracy. Experiments show that these modified algorithms outperform their original counterparts, and that their performance advantage grows with the size of the data set. (C) 2010 Elsevier B.V. All rights reserved.
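The shrinking-universe idea behind the positive approximation can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: a decision table is a dict of object-id to attribute-value dict, the heuristic is a simple greedy significance measure, and the accelerator removes objects already in the positive region so later rounds evaluate candidate attributes on a smaller universe.

```python
# Hedged sketch: greedy attribute reduction accelerated by shrinking the
# universe to objects not yet in the positive region (the key property of
# the positive approximation is that this shrinking does not change which
# attribute the greedy step selects).
from collections import defaultdict

def positive_region(universe, data, attrs, decision):
    """Objects in `universe` whose equivalence class under `attrs`
    is consistent (all members share one decision label)."""
    blocks = defaultdict(list)
    for obj in universe:
        key = tuple(data[obj][a] for a in attrs)
        blocks[key].append(obj)
    pos = set()
    for block in blocks.values():
        if len({decision[obj] for obj in block}) == 1:
            pos.update(block)
    return pos

def accelerated_reduct(data, decision, all_attrs):
    """Greedy forward selection; hypothetical names, for illustration only."""
    universe = set(data)
    # size of the positive region under all attributes: the stopping target
    target = len(positive_region(universe, data, all_attrs, decision))
    red, covered = [], 0
    while covered < target and universe:
        # pick the attribute that discerns the most remaining objects
        best = max(
            (a for a in all_attrs if a not in red),
            key=lambda a: len(positive_region(universe, data, red + [a], decision)),
        )
        red.append(best)
        pos = positive_region(universe, data, red, decision)
        covered += len(pos)
        universe -= pos  # accelerator: already-discerned objects drop out
    return red
```

Because the positive region is monotone under attribute addition, objects removed from the universe never leave it again, which is why evaluating candidates only on the remaining objects gives the same reduct as the unaccelerated greedy algorithm.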
