Article

Feature vector selection and projection using kernels

Journal

NEUROCOMPUTING
Volume 55, Issue 1-2, Pages 21-38

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/S0925-2312(03)00429-6

Keywords

kernel methods; feature space; data selection; principal component analysis; discriminant analysis

This paper provides new insight into kernel methods through data selection. The kernel trick is used to select from the data a relevant subset that forms a basis in a feature space F; the selected vectors thus define a subspace of F. The data is then projected onto this subspace, where classical algorithms are applied. We show that kernel methods such as generalized discriminant analysis (Neural Comput. 12 (2000) 2385) and kernel principal component analysis (Neural Comput. 10 (1998) 1299) can be expressed more easily in this form. Moreover, the size of the basis turns out to be related to the complexity of the model, so data selection provides complexity control and thus better generalization. The approach covers a wide range of algorithms. We investigate function approximation on real classification problems and on a regression problem. (C) 2003 Elsevier B.V. All rights reserved.
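The selection-and-projection idea from the abstract can be sketched in code. The following is a minimal illustration, not the authors' exact algorithm: the greedy loop, the RBF kernel, and the fitness criterion (squared cosine between a mapped sample and its projection onto the span of the selected vectors, computed entirely from kernel evaluations) are assumptions about how such a scheme is typically realized.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_feature_vectors(K, n_select):
    """Greedily pick samples whose images span a subspace of the
    feature space F that best reconstructs all mapped samples.
    Fitness of a candidate set S for sample i (hypothetical criterion):
        J_i = k_Si^T K_SS^{-1} k_Si / K_ii
    i.e. the squared cosine between phi(x_i) and its projection."""
    n = K.shape[0]
    selected = []
    for _ in range(n_select):
        best, best_score = None, -np.inf
        for j in range(n):
            if j in selected:
                continue
            S = selected + [j]
            K_SS = K[np.ix_(S, S)]
            K_Sn = K[np.ix_(S, range(n))]
            # Mean reconstruction fitness over all samples;
            # small ridge keeps the solve numerically stable.
            sol = np.linalg.solve(K_SS + 1e-10 * np.eye(len(S)), K_Sn)
            score = np.mean(np.sum(K_Sn * sol, axis=0) / np.diag(K))
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

def project_onto_basis(K_Sx, K_SS):
    # Coordinates of the projected samples in the selected subspace;
    # classical algorithms (PCA, discriminant analysis) then run on these.
    return np.linalg.solve(K_SS + 1e-10 * np.eye(K_SS.shape[0]), K_Sx).T
```

Once the data is projected, linear PCA on the coordinates plays the role of kernel PCA restricted to the selected subspace, and the basis size `n_select` acts as the complexity-control knob the abstract refers to.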
