Journal
INTERNATIONAL JOURNAL OF APPLIED PATTERN RECOGNITION
Volume 3, Issue 3, pp. 197-240
Publisher
INDERSCIENCE ENTERPRISES LTD
DOI: 10.1504/ijapr.2016.079733
Keywords
principal component analysis; PCA; dimensionality reduction; feature extraction; covariance matrix; singular value decomposition; SVD; PCA space; biometrics; image compression
Dimensionality reduction is a common preprocessing step in many machine learning applications, used to transform features into a lower-dimensional space. Principal component analysis (PCA) is one of the best-known unsupervised dimensionality reduction techniques. Its goal is to find the PCA space, which represents the directions of maximum variance in the given data. This paper presents the basic background needed to understand and implement the PCA technique. It starts with basic definitions of PCA and the algorithms of two methods for calculating it, namely, the covariance matrix and singular value decomposition (SVD) methods. Moreover, a number of numerical examples illustrate how the PCA space is calculated in easy steps. Three experiments are conducted to show how to apply PCA in real applications, including biometrics, image compression, and visualisation of high-dimensional datasets.
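The two calculation routes named in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's own code: the data matrix and its scaling are made up for the example, but the two methods (eigendecomposition of the covariance matrix, and SVD of the centred data) are the standard ones the abstract refers to, and they recover the same PCA space up to the sign of each direction.

```python
import numpy as np

# Hypothetical data: 100 samples in 3 dimensions, with deliberately
# unequal variance per axis so the principal directions are distinct.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([3.0, 1.0, 0.2])

# Centre the data: PCA directions are defined for mean-zero data.
Xc = X - X.mean(axis=0)

# Method 1: eigendecomposition of the covariance matrix.
C = np.cov(Xc, rowvar=False)                 # 3x3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)         # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
pcs_cov = eigvecs[:, order]                  # columns = principal directions

# Method 2: SVD of the centred data matrix (no covariance matrix needed).
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs_svd = Vt.T                               # right singular vectors

# The two bases span the same PCA space; columns agree up to sign,
# so |pcs_cov^T pcs_svd| should be (close to) the identity matrix.
agree = np.allclose(np.abs(pcs_cov.T @ pcs_svd), np.eye(3), atol=1e-8)

# Dimensionality reduction: project onto the first k principal components.
k = 2
X_reduced = Xc @ pcs_svd[:, :k]
```

The link between the two methods is that the eigenvalues of the covariance matrix equal the squared singular values of the centred data divided by n - 1, which is why the SVD route avoids forming the covariance matrix explicitly.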