Article

Locality-Preserving Discriminant Analysis in Kernel-Induced Feature Spaces for Hyperspectral Image Classification

Journal

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LGRS.2011.2128854

Keywords

Dimensionality reduction; feature space; hyperspectral imagery (HSI); kernel methods

Funding

  1. National Science Foundation [CCF0915307]
  2. National Geospatial-Intelligence Agency [HM1582-10-1-0001]

Abstract

Linear discriminant analysis (LDA) has been widely applied to hyperspectral image (HSI) analysis as a popular method for feature extraction and dimensionality reduction. Linear methods such as LDA work well for unimodal Gaussian class-conditional distributions. However, when data samples between classes are nonlinearly separated in the input space, linear methods such as LDA are expected to fail. Kernel discriminant analysis (KDA) attempts to address this issue by mapping data in the input space onto a subspace such that Fisher's ratio in an intermediate (higher-dimensional) kernel-induced space is maximized. In recent studies with HSI data, KDA has been shown to outperform LDA, particularly when the data distributions are non-Gaussian and multimodal, such as when pixels represent target classes severely mixed with background classes. In this letter, a modified KDA algorithm, kernel local Fisher discriminant analysis (KLFDA), is studied for HSI analysis. Unlike KDA, KLFDA imposes an additional constraint on the mapping: it ensures that neighboring points in the input space remain close in the projected subspace, and vice versa. Classification experiments with a challenging HSI task demonstrate that this approach outperforms current state-of-the-art HSI classification methods.
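The idea described above can be sketched in code. The following is a rough illustrative implementation of kernel local Fisher discriminant analysis, not the authors' method: the function names (`klfda_fit`, `klfda_transform`) are invented for this sketch, a plain RBF affinity stands in for the local-scaling affinity typically used in the LFDA literature, and the locality-weighted scatter matrices are expressed through graph Laplacians in the kernel-induced space.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def rbf_kernel(X, Y, gamma):
    """RBF (Gaussian) kernel matrix between rows of X and Y."""
    return np.exp(-gamma * cdist(X, Y, "sqeuclidean"))

def klfda_fit(X, y, n_components=2, gamma=1.0, reg=1e-6):
    """Illustrative kernel LFDA: maximize locality-weighted between-class
    scatter over within-class scatter in the kernel-induced space."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = K  # affinity; a local-scaling heuristic is common in practice
    # Locality-weighted pairwise weights (Sugiyama-style LFDA weighting):
    # same-class pairs are down-weighted by affinity; different-class
    # pairs contribute uniformly to the between-class term.
    Ww = np.zeros((n, n))
    Wb = np.full((n, n), 1.0 / n)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nc = len(idx)
        for i in idx:
            Ww[i, idx] = A[i, idx] / nc
            Wb[i, idx] = A[i, idx] * (1.0 / n - 1.0 / nc)
    # Graph Laplacians encode the weighted scatter: S = K L K.
    Lw = np.diag(Ww.sum(axis=1)) - Ww
    Lb = np.diag(Wb.sum(axis=1)) - Wb
    Sb = K @ Lb @ K
    Sw = K @ Lw @ K + reg * np.eye(n)  # regularize for invertibility
    # Generalized eigenproblem: maximize Fisher's ratio in kernel space.
    vals, vecs = eigh(Sb, Sw)
    alpha = vecs[:, ::-1][:, :n_components]  # top eigenvectors
    return alpha, X, gamma

def klfda_transform(X_new, model):
    """Project new samples using the learned kernel expansion."""
    alpha, X_train, gamma = model
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Because neighboring same-class points receive large affinity weights in `Ww`, directions that tear such neighbors apart are penalized, which is the locality-preserving constraint that distinguishes KLFDA from plain KDA.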
