Journal
PATTERN RECOGNITION
Volume 48, Issue 6, Pages 1999-2012
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2014.12.012
Keywords
Incremental linear discriminant analysis; Accelerated algorithm; Steepest descent method; Conjugate direction method; Feature extraction
Linear discriminant analysis (LDA) is a traditional statistical technique that reduces dimensionality while preserving as much of the class-discriminatory information as possible. The conventional form of LDA assumes that all the data are available in advance, and the LDA feature space is computed from the eigendecomposition of an appropriate matrix. However, there are situations where the data arrive in a sequence and the LDA features must be updated incrementally as new samples are observed. Chatterjee and Roychowdhury proposed an algorithm for incrementally computing the LDA features, and Moghaddam et al. subsequently accelerated its convergence rate. The algorithms proposed by Moghaddam et al. are derived by applying the chain rule to an implicit cost function. Since the authors did not have access to the cost function, they could not analyze the convergence of their algorithms, and the convergence of the proposed accelerated techniques was not guaranteed. In this paper, we briefly review the previously proposed algorithms and then derive new algorithms that accelerate the convergence rate of the incremental LDA algorithm given by Chatterjee and Roychowdhury. The proposed algorithms are derived by optimizing the step size in each iteration using the steepest descent and conjugate direction methods. We test the performance of the proposed algorithms for incremental LDA on synthetic and real data sets. The simulation results confirm that the proposed algorithms estimate the LDA features faster than the gradient descent based algorithm presented by Moghaddam et al. and the algorithm proposed by Chatterjee and Roychowdhury. (C) 2014 Elsevier Ltd. All rights reserved.
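The approach the abstract describes can be illustrated with a minimal sketch: scatter matrices are updated incrementally as samples arrive, and the LDA direction is refined by gradient ascent on the Fisher criterion with the step size chosen by a search at each iteration. This is a simplified stand-in, not the paper's actual algorithm — the toy data, the ridge initialization, and the coarse candidate-list line search (in place of the paper's closed-form steepest-descent step) are all illustrative assumptions.

```python
import numpy as np

def rayleigh(w, Sb, Sw):
    """Fisher criterion J(w) = (w' Sb w) / (w' Sw w)."""
    return float(w @ Sb @ w) / float(w @ Sw @ w)

rng = np.random.default_rng(0)

# Toy stream: two Gaussian classes separated along the first axis,
# so the true LDA direction is (1, 0).
X0 = rng.normal([0.0, 0.0], 0.5, size=(200, 2))
X1 = rng.normal([3.0, 0.0], 0.5, size=(200, 2))
stream = [(x, 0) for x in X0] + [(x, 1) for x in X1]
order = rng.permutation(len(stream))

dim = 2
counts = np.zeros(2)
means = np.zeros((2, dim))
Sw = np.eye(dim) * 1e-6          # small ridge keeps Sw invertible early on
w = rng.normal(size=dim)

for i in order:
    x, c = stream[i]
    # Welford-style update of the class mean and within-class scatter.
    counts[c] += 1
    delta = x - means[c]
    means[c] += delta / counts[c]
    Sw += np.outer(delta, x - means[c])

    if counts.min() < 2:
        continue                  # need both classes before Sb is defined

    # Between-class scatter recomputed cheaply from the running class means.
    mu = counts @ means / counts.sum()
    Sb = sum(counts[k] * np.outer(means[k] - mu, means[k] - mu)
             for k in range(2))

    # One gradient-ascent step on J(w); the step size is picked by a
    # coarse line search (a stand-in for an optimal steepest-descent step).
    J = rayleigh(w, Sb, Sw)
    g = 2.0 * (Sb @ w - J * Sw @ w) / float(w @ Sw @ w)
    candidates = [w + t * g for t in (1e-3, 1e-2, 1e-1, 1.0)]
    w = max(candidates + [w], key=lambda v: rayleigh(v, Sb, Sw))
    w /= np.linalg.norm(w)

print(np.round(w, 3))  # direction should align with the first axis
```

Because the candidate list always includes the current `w`, the Fisher criterion is non-decreasing across iterations, which is the property the closed-form step-size rules in the paper are designed to achieve more efficiently.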