Article

Accelerating the kernel-method-based feature extraction procedure from the viewpoint of numerical approximation

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 20, Issue 7, Pages 1087-1096

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-011-0534-5

Keywords

Pattern recognition; Kernel methods; Feature extraction; Kernel minimum squared error; Kernel PCA

Funding

  1. Program for New Century Excellent Talents in University [NCET-08-0156]
  2. National Natural Science Foundation of China [61071179, 90820306, 60902099, 61001037]
  3. Fundamental Research Funds for the Central Universities [HIT.NSRIF.2009130]
  4. 863 Program Project [2007AA01Z195]
  5. HKSAR Government
  6. Hong Kong Polytechnic University


The kernel method suffers from the following problem: the computational efficiency of the feature extraction procedure is inversely proportional to the size of the training sample set. In this paper, from a novel viewpoint, we propose a very simple and mathematically tractable method to produce a computationally efficient kernel-method-based feature extraction procedure. We first address the issue of how to make the feature extraction result of the reformulated kernel method approximate well that of the naive kernel method. We identify those training samples that statistically contribute most to the feature extraction results and exploit them to reformulate the kernel method into a computationally efficient feature extraction procedure. The basic idea of the proposed method is as follows: when a training sample has little effect on the feature extraction result and is statistically highly correlated with all the training samples, the feature extraction term associated with this sample can be removed from the feature extraction procedure. The proposed method has the following advantages. First, it proposes, for the first time, to improve the kernel method through a formal and principled evaluation of the feature extraction terms. Second, it improves the kernel method at a low extra cost and thus has a much more computationally efficient training phase than most previous improvements to the kernel method. Experimental comparison shows that the proposed method performs well in classification problems. This paper also gives an intuitive, geometrical illustration of the relation between the identified training samples and the other training samples.
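
The basic idea described above can be illustrated with a short sketch. The Python/NumPy code below uses kernel PCA as the underlying kernel method and a stand-in pruning rule (expansion-coefficient magnitude discounted by kernel-row correlation); the function names, the RBF kernel choice, and the keep_ratio parameter are assumptions made for illustration and do not reproduce the paper's exact statistical criterion.

import numpy as np

# Illustrative sketch only: this pruning rule approximates the idea of dropping
# feature extraction terms for low-contribution, highly correlated training
# samples; it is not the authors' exact formulation.

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def pruned_kernel_pca_features(X_train, X_test, n_components=5, keep_ratio=0.5, gamma=1.0):
    """Kernel PCA feature extraction with a pruned kernel expansion."""
    n = X_train.shape[0]
    K = rbf_kernel(X_train, X_train, gamma)

    # Centre the kernel matrix (standard kernel PCA step).
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J

    # Eigen-decomposition; columns of alpha hold the expansion coefficients.
    w, v = np.linalg.eigh(Kc)
    top = np.argsort(w)[::-1][:n_components]
    alpha = v[:, top] / np.sqrt(np.maximum(w[top], 1e-12))

    # Importance of each training sample in the expansion: coefficient
    # magnitude, discounted when its kernel row is highly correlated with
    # the other rows (its term is then nearly redundant).
    coef_mag = np.abs(alpha).sum(axis=1)
    row_corr = np.corrcoef(K).mean(axis=1)
    importance = coef_mag * (1.0 - row_corr)

    keep = np.argsort(importance)[::-1][: max(1, int(keep_ratio * n))]

    # Test-time extraction uses only the retained training samples,
    # so its cost scales with the pruned set, not the full training set
    # (test-kernel centring is omitted for brevity).
    K_test = rbf_kernel(X_test, X_train[keep], gamma)
    return K_test @ alpha[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_tr, X_te = rng.normal(size=(200, 10)), rng.normal(size=(20, 10))
    print(pruned_kernel_pca_features(X_tr, X_te).shape)  # (20, 5)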

