Journal
APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS
Volume 30, Issue 1, Pages 96-109
Publisher
ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.acha.2010.04.001
Keywords
Indefinite kernel; Mercer kernel; Coefficient regularization; Least square regression; Integral operator; Capacity independent error bounds; Learning rates
Funding
- Nature Science Fund of Shandong Province, China [Y2007A11]
- University of Jinan [XBS0832]
Abstract
In this paper, we provide a mathematical foundation for least square regression learning with an indefinite kernel and coefficient regularization. Beyond continuity and boundedness, the kernel function need not satisfy any further regularity conditions. An explicit expression for the solution via the sampling operator and an empirical integral operator is derived and plays an important role in our analysis. It provides a natural error decomposition in which the approximation error is characterized by a reproducing kernel Hilbert space associated with a certain Mercer kernel. A careful analysis shows that the sample error decays as O(1/√m). We deduce the error bound and prove asymptotic convergence. Satisfactory learning rates are then derived under a very mild regularity condition on the regression function. When the kernel is itself a Mercer kernel, better rates are given by a rigorous analysis, which shows that coefficient regularization is powerful in learning smooth functions. The saturation effect and the relation to spectral algorithms are discussed. (C) 2010 Elsevier Inc. All rights reserved.
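The coefficient-regularization scheme described in the abstract can be sketched numerically: the hypothesis is a kernel expansion f(x) = Σ_j c_j K(x, x_j) over the sample points, and one minimizes the empirical squared error plus an ℓ² penalty on the coefficient vector. Because only the coefficients are penalized, K need not be symmetric or positive semi-definite. The minimal sketch below is illustrative only; the objective scaling (1/m)·‖Kc − y‖² + λm·‖c‖², the closed-form normal-equation solve, and the tanh kernel (a standard example of an indefinite kernel) are assumptions for demonstration, not the paper's exact formulation.

```python
import numpy as np

def coeff_regularized_regression(X, y, kernel, lam):
    """Least square regression with l2 coefficient regularization.

    Minimizes (1/m) * ||K c - y||^2 + lam * m * ||c||^2 over c,
    where K[i, j] = kernel(X[i], X[j]). The kernel only needs to be
    continuous and bounded; it may be indefinite and non-symmetric.
    """
    m = len(X)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    # First-order condition: (K^T K + lam * m^2 * I) c = K^T y
    c = np.linalg.solve(K.T @ K + lam * m**2 * np.eye(m), K.T @ y)
    # Learned function: f(x) = sum_j c_j * kernel(x, X[j])
    return lambda x: float(sum(cj * kernel(x, xj) for cj, xj in zip(c, X)))

# Example with an indefinite kernel (tanh is a classic non-PSD choice)
X = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = X.copy()  # target: the identity function
f = coeff_regularized_regression(X, y, lambda s, t: np.tanh(s * t), lam=1e-6)
train_mse = np.mean([(f(x) - yx) ** 2 for x, yx in zip(X, y)])
```

Note the regularization enters through the coefficients rather than an RKHS norm, which is why no positive-definiteness of K is needed for the solve: K^T K + λm² I is always symmetric positive definite for λ > 0.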