Article

Feature vector selection and projection using kernels

Journal

NEUROCOMPUTING
Volume 55, Issues 1-2, Pages 21-38

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/S0925-2312(03)00429-6

Keywords

kernel methods; feature space; data selection; principal component analysis; discriminant analysis

This paper provides new insight into kernel methods through data selection. The kernel trick is used to select from the data a relevant subset that forms a basis in a feature space F; the selected vectors thus define a subspace of F. The data are then projected onto this subspace, where classical algorithms are applied. We show that kernel methods such as generalized discriminant analysis (Neural Comput. 12 (2000) 2385) and kernel principal component analysis (Neural Comput. 10 (1998) 1299) can be expressed more simply in this setting. Moreover, the size of the basis turns out to be related to the complexity of the model, so data selection provides complexity control and thus better generalization. The approach covers a wide range of algorithms. We investigate function approximation on real classification problems and on a regression problem. (C) 2003 Elsevier B.V. All rights reserved.
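The abstract's selection-then-projection idea can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's exact algorithm: it assumes an RBF kernel, a greedy criterion that adds the sample whose feature-space image is worst represented by the current basis, and it returns each point's coordinates in the span of the selected vectors. The function names and the `gamma` parameter are this sketch's own choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise RBF kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_feature_vectors(X, n_basis, gamma=0.5):
    """Greedy selection: repeatedly add the sample whose image in
    feature space is worst represented by the current basis."""
    selected = [0]  # start from an arbitrary sample
    for _ in range(n_basis - 1):
        S = X[selected]
        Kss_inv = np.linalg.pinv(rbf_kernel(S, S, gamma))
        Kxs = rbf_kernel(X, S, gamma)
        # Fitness J(x) = k_s(x)^T Kss^{-1} k_s(x) / k(x, x);
        # for the RBF kernel k(x, x) = 1, so the denominator drops out.
        fitness = np.einsum('ij,jk,ik->i', Kxs, Kss_inv, Kxs)
        fitness[selected] = np.inf  # never re-pick a basis vector
        selected.append(int(np.argmin(fitness)))
    return selected

def project(X, X_basis, gamma=0.5):
    """Coordinates of each phi(x) projected onto span{phi(s_i)}."""
    Kss = rbf_kernel(X_basis, X_basis, gamma)
    Kxs = rbf_kernel(X, X_basis, gamma)
    return Kxs @ np.linalg.pinv(Kss)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
idx = select_feature_vectors(X, n_basis=5)
Z = project(X, X[idx])   # 50 points, each with 5 subspace coordinates
```

Classical algorithms (PCA, discriminant analysis, least squares) can then be run on the low-dimensional `Z` instead of on the full kernel matrix, which is how the basis size acts as a complexity control.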

