Journal: PATTERN RECOGNITION
Volume 133, Issue -, Pages -
Publisher: ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.108976
Keywords: Optimal scoring; Linear discriminant analysis; Feature selection; ℓq-norm; Sparseness
This paper proposes a unified model based on the generalized ℓq-norm to address the challenge of optimal scoring on small sample size datasets, and develops an efficient alternating direction method of multipliers to handle the difficulties posed by the generalized norm. Numerical experiments demonstrate the effectiveness and feasibility of the proposed method.
Optimal scoring (OS), an equivalent form of linear discriminant analysis (LDA), is an important supervised learning method and dimensionality reduction tool. However, classical OS still struggles on small sample size (SSS) datasets. In this paper, to find sparse discriminant vectors, we propose a unified model for sparse optimal scoring (SOS) by virtue of the generalized ℓq-norm (0 ≤ q ≤ 1). To overcome the difficulty of treating the generalized ℓq-norm, we propose an efficient alternating direction method of multipliers (ADMM), in which the proximity operator of the ℓq-norm is employed for different values of q. Meanwhile, convergence results for our method are also established. Numerical experiments on artificial and benchmark datasets demonstrate the effectiveness and feasibility of the proposed method. (C) 2022 Elsevier Ltd. All rights reserved.
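The ADMM described in the abstract relies on the proximity operator of the ℓq-norm for different q values. As a minimal sketch (not the paper's implementation), the two endpoint cases have well-known closed forms: soft thresholding for q = 1 and hard thresholding for q = 0. The function name `prox_lq` is illustrative, and the closed forms for intermediate q (e.g. the half-thresholding rule for q = 1/2) are omitted:

```python
import numpy as np

def prox_lq(v, lam, q):
    """Elementwise proximity operator of lam * |u|^q:
    prox(v) = argmin_u 0.5*(u - v)**2 + lam*|u|**q.
    Only the standard closed forms for q = 1 and q = 0 are sketched here.
    """
    v = np.asarray(v, dtype=float)
    if q == 1:
        # Soft thresholding: shrink magnitudes by lam, zero out the rest.
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
    if q == 0:
        # Hard thresholding: keep v_i only when 0.5*v_i**2 > lam,
        # i.e. |v_i| > sqrt(2*lam).
        return np.where(np.abs(v) > np.sqrt(2.0 * lam), v, 0.0)
    raise NotImplementedError("closed forms for 0 < q < 1 not sketched here")
```

Inside an ADMM iteration such an operator would be applied to the discriminant-vector subproblem to enforce sparsity, with `lam` absorbing the penalty weight and the ADMM step size.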