4.7 Article

Combining neighborhood separable subspaces for classification via sparsity regularized optimization

Journal

INFORMATION SCIENCES
Volume 370, Pages 270-287

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2016.08.004

Keywords

Ensemble learning; Attribute reduction; Neighborhood rough sets; Joint representation; Group sparsity

Funding

  1. National Program on Key Basic Research Project [2013CB329304]
  2. National Natural Science Foundation of China [61502332, 61432011, 61222210]


The neighborhood rough set theory has been successfully applied to various classification tasks. Its key idea is to find a sufficient and necessary neighborhood separable subspace for building a compact model. For a given classification task, there usually exist numerous neighborhood separable subspaces that maintain the discriminative ability of the original space at a given granularity, and these subspaces contain complementary information for classification. However, computing these subspaces efficiently is challenging. In this paper, we develop a fast neighborhood attribute reduction algorithm based on sample pair selection to find all reducts; however, it cannot handle large-scale data. We therefore also propose a randomized attribute reduction algorithm based on neighborhood dependency, which finds a subset of all reducts very efficiently. A classification framework of joint subspace representation is proposed to fully exploit the complementary information in the different subspaces, and a weight matrix is learned to combine the representation residuals across subspaces via group sparsity regularization. The performance of the proposed attribute reduction algorithms is compared, and the influence of granularity on attribute reduction is discussed. Finally, the proposed technique is compared with other ensemble learning algorithms. Experimental results show that the proposed framework is superior to state-of-the-art classifiers. (C) 2016 Elsevier Inc. All rights reserved.
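
For readers unfamiliar with the neighborhood dependency measure that drives the attribute reduction step, the following is a minimal Python sketch, not the paper's exact algorithms: it computes the dependency gamma_B(D) = |POS_B(D)| / |U| under Euclidean delta-neighborhoods and runs a simple greedy forward reduction. The radius delta, the greedy search strategy, and the toy data are illustrative assumptions; the paper's sample-pair-selection and randomized algorithms are different and more efficient.

    # Minimal sketch (not the paper's exact algorithms): neighborhood dependency
    # gamma_B(D) = |POS_B(D)| / |U| under Euclidean delta-neighborhoods, plus a
    # simple greedy forward reduction that stops when the selected subset reaches
    # the dependency of the full attribute set. The radius `delta`, the greedy
    # strategy, and the toy data below are illustrative assumptions.
    import numpy as np

    def neighborhood_dependency(X, y, attrs, delta=0.15):
        """Fraction of samples whose delta-neighborhood (restricted to `attrs`)
        is pure with respect to the class labels, i.e. |POS_B(D)| / |U|."""
        Xb = X[:, attrs]
        # Pairwise Euclidean distances in the chosen subspace.
        dists = np.linalg.norm(Xb[:, None, :] - Xb[None, :, :], axis=-1)
        pos = 0
        for i in range(len(y)):
            neighbors = np.where(dists[i] <= delta)[0]
            if np.all(y[neighbors] == y[i]):
                pos += 1
        return pos / len(y)

    def greedy_reduct(X, y, delta=0.15):
        """Greedy forward attribute reduction driven by neighborhood dependency."""
        n_attrs = X.shape[1]
        full = neighborhood_dependency(X, y, list(range(n_attrs)), delta)
        reduct, current = [], 0.0
        remaining = set(range(n_attrs))
        while current < full and remaining:
            best, best_gain = None, 0.0
            for a in remaining:
                gain = neighborhood_dependency(X, y, reduct + [a], delta) - current
                if gain > best_gain:
                    best, best_gain = a, gain
            if best is None:        # no attribute improves the dependency
                break
            reduct.append(best)
            remaining.remove(best)
            current += best_gain
        return reduct

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.random((200, 8))                     # normalized attributes in [0, 1]
        y = (X[:, 0] + X[:, 3] > 1.0).astype(int)    # labels depend on two attributes
        print("selected attributes:", greedy_reduct(X, y))

In the paper's framework, several such reducts (neighborhood separable subspaces) are then used jointly: each test sample is represented in every subspace, and a weight matrix learned with group sparsity regularization combines the per-subspace representation residuals for classification.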

