Article

Joint sparse representation and locality preserving projection for feature extraction

Journal

International Journal of Machine Learning and Cybernetics

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s13042-018-0849-y

Keywords

Sparse representation; Locality preserving projection; Feature extraction; Dimensionality reduction

Funding

  1. National Natural Science Foundation of China [61702110, 61603100, 61772141]
  2. Guangdong Provincial Natural Science Foundation [17ZK0422]
  3. Guangzhou Science and Technology Project [201508010067, 201604020145, 2016201604030034, 201604046017, 201804010347]
  4. [2015]133
  5. [2014]97

Abstract

Traditional graph-based feature extraction methods use two separate procedures, graph learning and projection learning, to perform feature extraction. This makes the feature extraction result highly dependent on the quality of the initial fixed graph, which may not be the optimal graph for feature extraction. In this paper, we propose a novel unsupervised feature extraction method, joint sparse representation and locality preserving projection (JSRLPP), in which graph construction and feature extraction are carried out simultaneously. Specifically, we adaptively learn the similarity matrix by sparse representation and, at the same time, learn the projection matrix by preserving the local structure of the data. Compared with traditional feature extraction methods, our approach unifies graph learning and projection learning in a common framework and thus learns a graph better suited to feature extraction. Experiments on several public image data sets demonstrate the effectiveness of the proposed algorithm.
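
The abstract describes the joint scheme only at a high level. Below is a minimal sketch of one plausible reading, assuming an alternating procedure: an l1-regularized (Lasso) step that learns the similarity matrix by sparse representation, followed by a standard locality preserving projection step solved as a generalized eigenproblem. The function names (sparse_similarity, lpp_projection, jsrlpp_sketch), the ridge regularizer, and the alternation schedule are illustrative assumptions, not the authors' exact objective or solver.

    # Illustrative sketch only: alternates sparse-representation graph learning
    # and LPP-style projection learning; not the authors' exact formulation.
    import numpy as np
    from scipy.linalg import eigh
    from sklearn.linear_model import Lasso

    def sparse_similarity(X, alpha=0.01):
        """Build a similarity matrix by representing each sample (column of X)
        as a sparse combination of the remaining samples."""
        d, n = X.shape
        S = np.zeros((n, n))
        for i in range(n):
            others = np.delete(np.arange(n), i)
            lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
            lasso.fit(X[:, others], X[:, i])
            S[i, others] = np.abs(lasso.coef_)
        return (S + S.T) / 2.0                      # symmetrise the learned graph

    def lpp_projection(X, S, dim, reg=1e-6):
        """Locality preserving projection: solve X L X^T w = lam X D X^T w and
        keep the eigenvectors with the smallest eigenvalues."""
        D = np.diag(S.sum(axis=1))
        L = D - S                                   # graph Laplacian
        A = X @ L @ X.T
        B = X @ D @ X.T + reg * np.eye(X.shape[0])  # small ridge for stability (assumption)
        vals, vecs = eigh(A, B)                     # generalized symmetric eigenproblem
        return vecs[:, :dim]

    def jsrlpp_sketch(X, dim=10, alpha=0.01, n_iter=5):
        """Alternate graph learning and projection learning; X is d x n (samples as columns)."""
        S = sparse_similarity(X, alpha)             # initial graph in the original space
        for _ in range(n_iter):
            W = lpp_projection(X, S, dim)           # projection from the current graph
            S = sparse_similarity(W.T @ X, alpha)   # graph re-learned in the projected space
        return W, S

In this reading, the coupling lets the graph adapt to the learned subspace instead of staying fixed, which is exactly the dependence on the initial graph that the abstract identifies in traditional two-stage methods.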
