Article

Approximate information discriminant analysis: A computationally simple heteroscedastic feature extraction technique

Journal

PATTERN RECOGNITION
Volume 41, Issue 5, Pages 1548-1557

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2007.10.001

Keywords

feature extraction; information theory; mutual information; entropy; classification; linear discriminant analysis; Bayes error

Abstract

In this article we develop a novel linear dimensionality reduction technique for classification. The technique utilizes the first two statistical moments of the data and retains the computational simplicity characteristic of second-order techniques such as linear discriminant analysis. Formally, the technique maximizes a criterion that belongs to the class of probability dependence measures and is naturally defined for multiple classes. The criterion is based on an approximation of an information-theoretic measure and is capable of handling heteroscedastic data. The performance of our method, along with that of similar feature extraction approaches, is demonstrated in experiments on real-world datasets. Our method compares favorably to similar second-order linear dimensionality reduction techniques. (c) 2007 Elsevier Ltd. All rights reserved.
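The abstract does not give the paper's approximate information-theoretic criterion, so the sketch below is only an illustrative stand-in for second-order heteroscedastic feature extraction: it maximizes the summed pairwise Bhattacharyya distance between the projected class-conditional Gaussians over a one-dimensional projection, using just the first two moments of each class. The function names (`fit_direction`, `bhattacharyya_1d`) are hypothetical and this is not the authors' AIDA method.

```python
import numpy as np
from scipy.optimize import minimize


def bhattacharyya_1d(m1, v1, m2, v2):
    """Bhattacharyya distance between 1-D Gaussians N(m1, v1) and N(m2, v2)."""
    v = 0.5 * (v1 + v2)
    return 0.125 * (m1 - m2) ** 2 / v + 0.5 * np.log(v / np.sqrt(v1 * v2))


def fit_direction(X, y, seed=0):
    """Find a unit vector w maximizing the summed pairwise Bhattacharyya
    distance between the projected class-conditional Gaussians.

    Uses only the first two statistical moments (mean, covariance) of each
    class, so it remains a second-order technique like LDA, but the
    variance term lets it exploit heteroscedastic (unequal-covariance) data.
    """
    classes = np.unique(y)
    # first two statistical moments of each class
    stats = [(X[y == c].mean(axis=0), np.cov(X[y == c].T)) for c in classes]

    def neg_criterion(w):
        w = w / np.linalg.norm(w)  # the criterion is scale-invariant in w
        total = 0.0
        for i in range(len(stats)):
            for j in range(i + 1, len(stats)):
                (mi, Si), (mj, Sj) = stats[i], stats[j]
                total += bhattacharyya_1d(w @ mi, w @ Si @ w,
                                          w @ mj, w @ Sj @ w)
        return -total  # minimize the negative to maximize the criterion

    w0 = np.random.default_rng(seed).normal(size=X.shape[1])
    res = minimize(neg_criterion, w0, method="Nelder-Mead")
    return res.x / np.linalg.norm(res.x)
```

The usage below illustrates the heteroscedastic point made in the abstract: when two classes share the same mean but differ in covariance, LDA's between-class scatter vanishes, while a covariance-aware criterion like this one still recovers a discriminative direction.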
