Journal
PATTERN RECOGNITION
Volume 66, Issue -, Pages 328-341
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2016.08.020
Keywords
Dimensionality reduction; Linear discriminant analysis (LDA); Exponential discriminant analysis (EDA); Matrix exponential; Krylov subspace
Funding
- National Science Foundation of China [11371176]
- Natural Science Foundation of Jiangsu Province [BK20131126]
- Talent Introduction Program of China University of Mining and Technology
Abstract
Exponential discriminant analysis (EDA) is a generalized discriminant analysis method based on the matrix exponential. It can essentially overcome the intrinsic difficulty of the small-sample-size problem that exists in classical linear discriminant analysis (LDA). However, for high-dimensional data one has to solve a large matrix exponential eigenproblem in this method, and the time complexity is dominated by the computation of exponentials of large matrices. In this paper, we propose two inexact Krylov subspace algorithms for solving the large matrix exponential eigenproblem efficiently. The contribution of this work is threefold. First, we consider how to compute matrix exponential-vector products efficiently, which is the key step in the Krylov subspace method. Second, we compare the discriminant analysis criterion of EDA with that of LDA from a theoretical point of view. Third, we establish the relationship between the accuracy of the approximate eigenvectors and the distance to the nearest neighbor classifier, and show why the matrix exponential eigenproblem can be solved approximately in practice. Numerical experiments on some real-world databases show the superiority of our new algorithms over their original counterparts for face recognition.
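The key computational step the abstract highlights, forming matrix exponential-vector products exp(A)v inside a Krylov subspace, can be sketched as follows. This is a minimal, illustrative Arnoldi-based approximation of exp(A)v in Python; the function name `krylov_expv` and the subspace dimension `m` are our own choices for the sketch, not the paper's actual algorithms.

```python
import numpy as np
from scipy.linalg import expm

def krylov_expv(A, v, m=30):
    """Approximate exp(A) @ v in an m-dimensional Krylov subspace via Arnoldi.

    Builds an orthonormal basis V of span{v, Av, ..., A^(m-1) v} and a small
    upper Hessenberg matrix H with V.T @ A @ V = H, then uses the classical
    approximation exp(A) v ~ ||v|| * V @ exp(H) @ e1, so only the small
    m-by-m exponential is ever formed explicitly.
    """
    n = len(v)
    m = min(m, n)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        # Modified Gram-Schmidt orthogonalization against previous basis vectors
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:  # "happy breakdown": Krylov subspace is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m)
    e1[0] = 1.0
    return beta * V[:, :m] @ (expm(H[:m, :m]) @ e1)
```

For symmetric scatter matrices such as those arising in EDA, the Hessenberg matrix becomes tridiagonal (the Lanczos case), and the cost per step is one matrix-vector product, which is what makes Krylov methods attractive when the data dimension is large.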