Article

Kernel online learning with adaptive kernel width

Journal

NEUROCOMPUTING
Volume 175, Pages 233-242

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2015.10.055

Keywords

Online learning; Kernel width; Adaptive learning; Cumulative coherence; Convergence


This paper presents a unified framework for kernel online learning (KOL) algorithms with adaptive kernels. Unlike traditional KOL algorithms, which apply a fixed kernel width throughout training, the kernel width is treated as an additional free parameter and adapted automatically. A robust training method is proposed based on an adaptive dead-zone scheme: the kernel weights and the kernel width are updated under a unified framework in which they share the same learning parameters. We present a theoretical convergence analysis of the proposed adaptive training method, which switches off learning when the training error is too small relative to the external disturbance. To regularize the number of kernel functions, an in-depth measure, the cumulative coherence, is applied: a dictionary of predefined size is selected by online minimization of its cumulative coherence, without using any parameters that require prior knowledge of the training samples. Simulation results show that the proposed algorithm adapts effectively to the training data under different initial kernel widths, and achieves better testing accuracy and faster convergence than kernel algorithms with a fixed kernel width. (C) 2015 Elsevier B.V. All rights reserved.
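The core ideas in the abstract (online kernel weight updates, a gradient-adapted kernel width, and a dead zone that switches off learning on small errors) can be illustrated with a minimal kernel-LMS-style sketch. This is not the paper's exact algorithm: the hyperparameters `eta`, `eta_sigma`, `sigma0`, and `dead_zone` are illustrative assumptions, the dead zone here is a fixed threshold rather than the paper's adaptive disturbance-based scheme, and the fixed-size coherence-minimizing dictionary is omitted (every non-dead-zone sample is kept as a center).

```python
import numpy as np

def gauss_kernel(x, centers, sigma):
    # Gaussian kernel between x and each stored center;
    # also return squared distances (needed for the width gradient).
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2)), d2

def kol_adaptive_width(X, y, eta=0.5, eta_sigma=0.05, sigma0=1.0, dead_zone=1e-3):
    """Kernel LMS with an adaptive Gaussian kernel width (sketch only).

    eta, eta_sigma, sigma0, dead_zone are illustrative settings, not
    the paper's learning parameters.
    """
    centers = [X[0]]            # kernel centers (dictionary, unbounded here)
    alphas = [eta * y[0]]       # kernel weights
    sigma = sigma0
    errors = []
    for x, target in zip(X[1:], y[1:]):
        C = np.asarray(centers)
        a = np.asarray(alphas)
        k, d2 = gauss_kernel(x, C, sigma)
        e = target - a @ k      # instantaneous prediction error
        errors.append(e)
        if abs(e) < dead_zone:  # dead zone: switch off learning on tiny errors
            continue
        # Width update: gradient descent on the instantaneous squared error.
        # d f / d sigma = sum_i a_i * k_i * d2_i / sigma^3
        grad_sigma = -e * np.sum(a * k * d2) / sigma ** 3
        sigma = max(sigma - eta_sigma * grad_sigma, 1e-3)
        # Weight update: store the current sample as a new kernel center.
        centers.append(x)
        alphas.append(eta * e)
    return np.asarray(centers), np.asarray(alphas), sigma, errors
```

On a simple 1-D regression task (e.g. fitting a sine wave), the running error of this sketch typically decreases while `sigma` drifts away from its initial value, which is the behavior the abstract attributes to the adaptive-width scheme.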
