Article

Robust Locally Discriminant Analysis via Capped Norm

Journal

IEEE Access
Volume 7, Pages 4641-4652

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/ACCESS.2018.2885131

Keywords

Feature extraction; capped L2-norm loss; L2,1-norm regularization; manifold learning; discriminant analysis

Funding

  1. Natural Science Foundation of China [61573248, 61802267, 61773328, 61732011, 61703283]
  2. Guangdong Natural Science Foundation [2017A030313367, 2017A030310067]
  3. Shenzhen Municipal Science and Technology Innovation Council [JCYJ20170302153434048, JCYJ20160429182058044]

Abstract

Conventional linear discriminant analysis (LDA) and its extended versions have several potential drawbacks. First, they are sensitive to outliers, noise, and variations in the data, which degrades their performance in dimensionality reduction. Second, most LDA-based methods focus only on the global structure of the data and ignore its local geometric structure, which plays an important role in dimensionality reduction. More importantly, the total number of projections obtained by LDA-based methods is limited by the number of classes in the training data set. To solve these problems, we propose in this paper a novel method called robust locally discriminant analysis via capped norm (RLDA). By replacing the L2-norm with the L2,1-norm to construct the robust between-class scatter matrix, and by using the capped norm to further reduce the negative impact of outliers when constructing the within-class scatter matrix, we guarantee the robustness of the proposed method. In addition, we impose an L2,1-norm regularization term on the projection matrix so that its joint sparsity is ensured. Since we redefine the scatter matrices of traditional LDA, the number of projections we obtain is no longer restricted by the number of classes. The experimental results show the superior performance of RLDA over the other compared dimensionality reduction methods.
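
For readers unfamiliar with the two norms named in the abstract, the following minimal sketch illustrates how an L2,1-norm of a projection matrix (the joint-sparsity regularizer) and a capped L2-norm loss over within-class residuals can be computed. It is only an illustration of the norm definitions under one common convention (capping each residual's squared L2 norm); it is not the RLDA algorithm itself, and the function names (l21_norm, capped_l2_loss) and the cap parameter eps are hypothetical.

```python
import numpy as np


def l21_norm(W):
    """L2,1-norm of a matrix: the sum of the L2 norms of its rows.

    Penalizing this quantity drives whole rows of W toward zero,
    which is what "joint sparsity" of a projection matrix means.
    """
    return np.sum(np.linalg.norm(W, axis=1))


def capped_l2_loss(residuals, eps):
    """Capped L2-norm loss over a set of residual vectors.

    Each residual contributes its squared L2 norm, but the contribution
    is capped at eps, so extreme outliers cannot dominate the loss.
    (The cap could equally be placed on the unsquared norm; the exact
    formulation used in the paper may differ.)
    """
    sq_norms = np.sum(residuals ** 2, axis=1)
    return np.sum(np.minimum(sq_norms, eps))


# Hypothetical usage on toy data: residuals of samples from their class mean.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))        # 10 samples, 5 features (one class)
X[0] += 50.0                        # inject an outlier
residuals = X - X.mean(axis=0)

print("L2,1-norm of X:          ", l21_norm(X))
print("capped within-class loss:", capped_l2_loss(residuals, eps=4.0))
```

With the cap in place, the injected outlier contributes at most eps to the within-class loss, whereas an ordinary squared L2 loss would let it dominate; this is the robustness effect the abstract refers to.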
