Journal
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
Volume 11, Issue 10, Pages 2247-2260
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s13042-020-01113-7
Keywords
Feature selection; Low-rank representation; Image classification; Small-class problem; Subspace learning
Funding
- Natural Science Foundation of China [61976145, 61802267, 61732011]
- Shenzhen Municipal Science and Technology Innovation Council [JCYJ20180305124834854, JCYJ20160429182058044]
- Natural Science Foundation of Guangdong Province [2017A030313367]
Robustness to outliers, noise, and corruption has recently attracted increasing attention as a way to improve performance in linear feature extraction and image classification. As one of the most effective subspace learning methods, low-rank representation (LRR) improves the robustness of an algorithm by exploiting the global representative structure among samples. However, traditional LRR cannot project the training samples into a low-dimensional subspace using supervised information. In this paper, we therefore integrate the properties of LRR with supervised dimensionality reduction techniques to obtain an optimal low-rank subspace and a discriminative projection simultaneously. To achieve this goal, we propose a novel model named Discriminative Low-Rank Projection (DLRP). Furthermore, DLRP overcomes the small-class problem, in which the number of projections is bounded by the number of classes. Our model can be solved by the linearized alternating direction method with adaptive penalty (LADMAP) together with the singular value decomposition. In addition, we analyze the differences between DLRP and previous related models. Extensive experiments on various contaminated databases confirm the superiority of the proposed method.
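The SVD-based step in LADMAP-style solvers for nuclear-norm (low-rank) terms is singular value thresholding, the proximal operator of the nuclear norm. The sketch below is illustrative only, not the paper's DLRP algorithm; the function name `svt` and the threshold parameter `tau` are our own choices.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of
    tau * ||.||_* evaluated at M. Shrinks each singular value
    toward zero by tau, which lowers the rank of the result."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold the singular values
    return (U * s_shrunk) @ Vt

# Example: denoising a low-rank matrix corrupted by small noise.
rng = np.random.default_rng(0)
low_rank = rng.standard_normal((20, 5)) @ rng.standard_normal((5, 20))  # rank <= 5
noisy = low_rank + 0.01 * rng.standard_normal((20, 20))
recovered = svt(noisy, tau=1.0)  # small singular values (mostly noise) are zeroed
```

In an LADMAP iteration for an LRR-type model, an update of the low-rank coefficient matrix reduces to one such `svt` call with `tau` set by the current penalty parameter.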