4.6 Article

Neural network based class-conditional probability density function using kernel trick for supervised classifier

Journal

NEUROCOMPUTING
Volume 154, Pages 225-229

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2014.11.070

Keywords

Back propagation neural network; Bayes classifier; Kernel trick; Kernel-Linear discriminant analysis; Higher dimensional space

A practical limitation of the Bayes classifier in pattern recognition is computing the class-conditional probability density function (pdf) of the vectors belonging to each class. In this paper, a neural network based approach is proposed to model the class-conditional pdf, which can then be used in a supervised classifier (e.g., a Bayes classifier). A kernel version of the proposed approach (via the kernel trick) is also suggested, to obtain the class-conditional pdf of the corresponding training vectors in a higher dimensional space. This yields better class separation and hence a higher classification rate. The performance of the proposed technique is validated on both synthetic and real data. Simulation results show that the proposed technique performs well (in terms of classification accuracy) on synthetic and real data when compared with classical Fisher's Linear Discriminant Analysis (LDA) and Gaussian-kernel-based Kernel-LDA. (C) 2014 Elsevier B.V. All rights reserved.
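To illustrate the decision rule the abstract builds on, the following is a minimal sketch of a Bayes classifier driven by estimated class-conditional pdfs. Gaussian kernel density estimation stands in for the paper's neural-network density model (an assumption: the paper's actual estimator and its kernel-trick variant are not reproduced here); the synthetic two-class data and all function names are illustrative only.

```python
import numpy as np

def gaussian_kde_pdf(x, samples, bandwidth=0.5):
    """Estimate the class-conditional pdf p(x | class) with a Gaussian KDE.

    This is a stand-in for the paper's neural-network pdf model.
    """
    diffs = samples - x                          # (n, d) differences
    sq_dist = np.sum(diffs ** 2, axis=1)         # squared Euclidean distances
    d = samples.shape[1]
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return float(np.mean(np.exp(-sq_dist / (2.0 * bandwidth ** 2))) / norm)

def bayes_predict(x, class_samples, priors):
    """Bayes decision rule: argmax_c  p(x | c) * P(c)."""
    scores = [gaussian_kde_pdf(x, s) * p
              for s, p in zip(class_samples, priors)]
    return int(np.argmax(scores))

# Two well-separated synthetic 2-D classes.
rng = np.random.default_rng(0)
class0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
class1 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(100, 2))

print(bayes_predict(np.array([0.1, -0.2]), [class0, class1], [0.5, 0.5]))  # near class 0
print(bayes_predict(np.array([2.9, 3.1]), [class0, class1], [0.5, 0.5]))   # near class 1
```

The kernel version discussed in the abstract would replace the Euclidean distances above with distances computed implicitly in a higher-dimensional feature space, which is where the better class separation comes from.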

