Article

Test-cost-sensitive attribute reduction

Journal

INFORMATION SCIENCES
Volume 181, Issue 22, Pages 4928-4942

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2011.07.010

Keywords

Cost-sensitive learning; Attribute reduction; Test cost; Heuristic algorithm

Funding

  1. National Natural Science Foundation of China [60873077, 60903110]

Abstract

In many data mining and machine learning applications, the task of classification has two objectives: decreasing the test cost and improving the classification accuracy. Most existing research focuses on the latter, with attribute reduction serving as an optional pre-processing stage to remove redundant attributes. In this paper, we point out that when tests must be undertaken in parallel, attribute reduction is mandatory for dealing with the former objective. With this in mind, we pose the minimal-test-cost reduct problem, a new and more general variant of the classical reduct problem. We also define three metrics to evaluate the performance of reduction algorithms from a statistical viewpoint. A heuristic algorithm framework is proposed to deal with the new problem: specifically, an information-gain-based lambda-weighted reduction algorithm is designed, where weights are determined by test costs and a non-positive exponent lambda, the only parameter set by the user. The algorithm is tested with three representative test-cost distributions on four UCI (University of California, Irvine) datasets. Experimental results show that there is a trade-off in setting lambda, and that a competition approach can significantly improve the quality of the result. This study suggests potential application areas and new research trends concerning attribute reduction. (C) 2011 Elsevier Inc. All rights reserved.
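The lambda-weighted heuristic the abstract describes can be sketched roughly as follows. This is an illustrative reconstruction under stated assumptions, not the paper's exact algorithm: the toy dataset, the greedy stopping rule, and the score `gain * cost**lam` (with `lam <= 0` penalizing expensive tests) are assumptions based only on the abstract's description.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def cond_entropy(rows, attrs, labels):
    """H(D | attrs): partition rows by the values of the chosen attributes."""
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(tuple(row[a] for a in attrs), []).append(y)
    n = len(labels)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def weighted_reduct(rows, labels, costs, lam=-1.0):
    """Greedy lambda-weighted reduction (a sketch of the heuristic idea):
    at each step pick the attribute maximizing information gain times
    cost**lam, where lam <= 0 makes cheap tests more attractive."""
    all_attrs = list(range(len(rows[0])))
    selected = []
    base = cond_entropy(rows, all_attrs, labels)  # best achievable entropy
    while cond_entropy(rows, selected, labels) > base + 1e-10:
        best, best_score = None, -1.0
        current = cond_entropy(rows, selected, labels)
        for a in all_attrs:
            if a in selected:
                continue
            gain = current - cond_entropy(rows, selected + [a], labels)
            score = gain * costs[a] ** lam  # test-cost-weighted significance
            if score > best_score:
                best, best_score = a, score
        if best is None:
            break
        selected.append(best)
    return selected

# Two attributes both determine the class, but attribute 0 costs 10
# and attribute 1 costs 1; with lam = -1 the cheap test is chosen.
rows = [[0, 0], [0, 0], [1, 1], [1, 1]]
labels = [0, 0, 1, 1]
print(weighted_reduct(rows, labels, costs=[10.0, 1.0], lam=-1.0))  # [1]
```

With `lam = 0` the cost factor vanishes and the procedure degenerates into a plain information-gain reduction, which matches the abstract's framing of lambda as the single user-set parameter trading accuracy-oriented reduction against test cost.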
