Article

Orthogonal self-guided similarity preserving projection for classification and clustering

Journal

NEURAL NETWORKS
Volume 88, Pages 1-8

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2017.01.001

Keywords

Dimensionality reduction; Intrinsic structure; Subspace clustering; Feature representation

Funding

  1. National Basic Research Program of China (973 Program) [2012CB316400]
  2. National Natural Science Foundation of China [61370163, 61332011]


A suitable feature representation can faithfully preserve the intrinsic structure of data. However, traditional dimensionality reduction (DR) methods commonly use the original input features to define the intrinsic structure, which makes the estimated structure unreliable because the original features may contain redundant or noisy components. This creates a dilemma: (1) one needs the most suitable feature representation to define the intrinsic structure of the data, and (2) one should use the proper intrinsic structure of the data to perform feature extraction. To address this problem, we propose a unified learning framework that simultaneously obtains the optimal feature representation and the intrinsic structure of the data. The structure is learned from the results of feature learning, and the features are learned to preserve the refined structure of the data. By leveraging the interaction between these two processes, we can capture an accurate structure and obtain the optimal feature representation of the data. Experimental results demonstrate that our method outperforms state-of-the-art methods in DR and subspace clustering. The code of the proposed method is available at http://www.yongxu.org/lunwen.html. (C) 2017 Elsevier Ltd. All rights reserved.
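The abstract describes an alternating scheme: the similarity structure guides the projection, and the projected features are used to refine the structure. Below is a minimal sketch of that idea, assuming a Gaussian k-NN affinity as the structure model and a Laplacian-based similarity-preserving criterion for the orthogonal projection; the functions `knn_affinity`, `similarity_preserving_projection`, and `self_guided_projection` are hypothetical names for illustration, and the paper's actual objective and constraints may differ.

```python
# Sketch only: alternating between (1) learning an orthogonal projection that
# preserves a given similarity structure and (2) re-estimating the structure
# from the projected features. The Gaussian k-NN affinity and Laplacian
# criterion are assumptions made for illustration, not the paper's formulation.

import numpy as np

def knn_affinity(Z, k=10, sigma=1.0):
    """Gaussian k-NN similarity graph built from current features Z (d x n)."""
    n = Z.shape[1]
    sq = np.sum(Z**2, axis=0)
    d2 = sq[:, None] + sq[None, :] - 2 * Z.T @ Z        # pairwise squared distances
    S = np.exp(-np.maximum(d2, 0) / (2 * sigma**2))
    np.fill_diagonal(S, 0)
    idx = np.argsort(S, axis=1)[:, :-k]                 # all but the k largest per row
    for i in range(n):
        S[i, idx[i]] = 0                                 # keep only k nearest neighbors
    return (S + S.T) / 2                                 # symmetrize

def similarity_preserving_projection(X, S, dim):
    """Orthogonal P (d x dim) minimizing sum_ij S_ij ||P^T x_i - P^T x_j||^2."""
    D = np.diag(S.sum(axis=1))
    L = D - S                                            # graph Laplacian of current structure
    M = X @ L @ X.T
    vals, vecs = np.linalg.eigh((M + M.T) / 2)
    return vecs[:, :dim]                                 # eigenvectors of smallest eigenvalues

def self_guided_projection(X, dim=20, k=10, n_iter=10):
    """Alternate between structure estimation and projection learning."""
    S = knn_affinity(X, k)                               # initial structure from raw features
    for _ in range(n_iter):
        P = similarity_preserving_projection(X, S, dim)
        S = knn_affinity(P.T @ X, k)                     # refine structure in the learned subspace
    return P, S

# Usage with random data, just to exercise the sketch:
X = np.random.randn(100, 500)                            # d = 100 features, n = 500 samples
P, S = self_guided_projection(X, dim=20)
```

The refined similarity matrix S could then feed a spectral clustering step for the subspace-clustering experiments, while P^T X would serve as the reduced representation for classification; both uses follow the framework described above rather than a specific protocol from the paper.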

