Journal
COGNITIVE COMPUTATION
Volume 6, Issue 3, Pages 608-621
Publisher
SPRINGER
DOI: 10.1007/s12559-014-9252-5
Keywords
Sparse representation (coding); Classification; Feature extraction; Feature selection; Dimension reduction; Structure preserving
Funding
- National Science Foundation [ECCS 1053717]
- Army Research Office [W911NF-12-1-0378]
- NSF-DFG Collaborative Research on Autonomous Learning [CNS 1117314]
- Defense Advanced Research Projects Agency (DARPA) [FA8650-11-1-7152, FA8650-11-1-7148]
Abstract
Sparse-representation-based classification (SRC), which classifies data by the sparse reconstruction error, has recently emerged as a technique in pattern recognition. However, the computational cost of sparse coding is high in real applications. In this paper, several dimension reduction methods are studied in the context of SRC to improve classification accuracy while reducing computational cost. A feature extraction method, principal component analysis (PCA), and two feature selection methods, the Laplacian score and the Pearson correlation coefficient, are applied in the data preparation step to preserve the structure of the data in the lower-dimensional space. The classification performance of SRC with structure-preserving dimension reduction (SRC-SPDR) is compared to that of classical classifiers such as k-nearest neighbors and support vector machines. Experiments on UCI and face data sets demonstrate that SRC-SPDR is effective at relatively low computational cost.
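The pipeline summarized in the abstract — reduce dimension first, then classify by class-wise sparse reconstruction error — can be sketched as follows. This is an illustrative sketch, not the authors' implementation: scikit-learn's `Lasso` stands in for the sparse coder, PCA plays the role of the structure-preserving reduction, and the Iris data set substitutes for the paper's UCI benchmarks.

```python
# Sketch of SRC with PCA dimension reduction (SRC-SPDR-style pipeline).
# Assumptions: Lasso as the sparse coder, Iris as a stand-in data set.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split

def src_predict(D, labels, x, alpha=0.01):
    """Assign x to the class whose atoms yield the smallest reconstruction error."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(D, x)                          # sparse code of x over the dictionary D
    c = lasso.coef_
    errors = {cls: np.linalg.norm(x - D[:, labels == cls] @ c[labels == cls])
              for cls in np.unique(labels)}
    return min(errors, key=errors.get)       # smallest class-wise residual wins

X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                      random_state=0, stratify=y)

# Dimension reduction step: project the data into a lower-dimensional space.
pca = PCA(n_components=2).fit(Xtr)
D = pca.transform(Xtr).T                     # columns of D are the training "atoms"
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms, as is standard in SRC
Zte = pca.transform(Xte)

preds = np.array([src_predict(D, ytr, z) for z in Zte])
acc = (preds == yte).mean()
print(f"SRC accuracy after PCA reduction: {acc:.2f}")
```

The paper's feature selection variants (Laplacian score, Pearson correlation coefficient) would replace the PCA step above with a selection of original feature columns, leaving the SRC classification step unchanged.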