4.7 Article

Positive approximation: An accelerator for attribute reduction in rough set theory

Journal

ARTIFICIAL INTELLIGENCE
Volume 174, Issue 9-10, Pages 597-618

Publisher

ELSEVIER
DOI: 10.1016/j.artint.2010.04.018

Keywords

Rough set theory; Attribute reduction; Decision table; Positive approximation; Granular computing

Funding

  1. National Natural Science Foundation of China [60773133, 60903110, 70971080]
  2. National Key Basic Research and Development Program of China (973) [2007CB311002]
  3. Government of Hong Kong SAR [GRF: CityU 113308]
  4. National High Technology Research and Development Program of China [2007AA01Z165]
  5. Natural Science Foundation of Shanxi Province, China [2008011038, 2009021017-1]

Abstract

Feature selection is a challenging problem in areas such as pattern recognition, machine learning and data mining. Considering a consistency measure introduced in rough set theory, the problem of feature selection, also called attribute reduction, aims to retain the discriminatory power of the original features. Many heuristic attribute reduction algorithms have been proposed; however, these methods are often computationally time-consuming. To overcome this shortcoming, we introduce a theoretical framework based on rough set theory, called the positive approximation, which can be used to accelerate a heuristic process of attribute reduction. Based on the proposed accelerator, a general attribute reduction algorithm is designed. Through the use of the accelerator, several representative heuristic attribute reduction algorithms in rough set theory have been enhanced. Each of the modified algorithms selects the same attribute reduct as its original version, and hence possesses the same classification accuracy. Experiments show that the modified algorithms outperform their original counterparts in computation time, and that this performance gain becomes more pronounced on larger data sets. © 2010 Elsevier B.V. All rights reserved.
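
The accelerator lends itself to a compact description: run an ordinary forward greedy search driven by the positive region, but after each attribute is selected, discard the objects that the current subset already places in the positive region, so that later significance evaluations work on a shrinking universe. The Python sketch below is only an illustration of that idea under simple assumptions (a small in-memory decision table, indiscernibility computed by grouping on attribute values); it is not the authors' implementation, and the helper names `partition`, `positive_region` and `reduct` are invented for the example.

    from collections import defaultdict

    def partition(universe, table, attrs):
        """Group object indices by their values on attrs (equivalence classes)."""
        blocks = defaultdict(list)
        for i in universe:
            blocks[tuple(table[i][a] for a in attrs)].append(i)
        return blocks.values()

    def positive_region(universe, table, attrs, decision):
        """Objects whose equivalence class under attrs carries a single decision value."""
        pos = set()
        for block in partition(universe, table, attrs):
            if len({table[i][decision] for i in block}) == 1:
                pos.update(block)
        return pos

    def reduct(table, cond_attrs, decision):
        """Forward greedy attribute reduction with a shrinking universe."""
        universe = set(range(len(table)))
        target = len(positive_region(universe, table, cond_attrs, decision))
        red, covered = [], 0
        remaining = set(universe)  # accelerator: keep only unresolved objects
        while covered < target:
            # significance of each candidate, evaluated on the remaining objects only
            best = max(
                (a for a in cond_attrs if a not in red),
                key=lambda a: len(positive_region(remaining, table, red + [a], decision)),
            )
            red.append(best)
            newly = positive_region(remaining, table, red, decision)
            covered += len(newly)   # objects newly resolved by the enlarged subset
            remaining -= newly      # drop them before the next round
        return red

    # Toy decision table: columns 0-2 are conditional attributes, column 3 is the decision.
    toy = [
        (0, 1, 0, 'yes'),
        (0, 1, 1, 'yes'),
        (1, 0, 0, 'no'),
        (1, 1, 0, 'no'),
        (0, 0, 1, 'yes'),
    ]
    print(reduct(toy, [0, 1, 2], 3))  # -> [0]: attribute 0 alone separates the decisions

Because the objects removed at each step can no longer affect which equivalence classes are consistent, the sketch selects the same attribute subset that the unaccelerated greedy search would select, while recomputing partitions over progressively fewer objects; this is the source of the speedup that, as the abstract notes, grows with the size of the data set.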

Authors

Yuhua Qian, Jiye Liang, Witold Pedrycz, Chuangyin Dang

Reviews

Primary Rating

4.7
Not enough ratings

Secondary Ratings

Novelty: -
Significance: -
Scientific rigor: -

Recommended

No Data Available