Article

Least square regression with indefinite kernels and coefficient regularization

Journal

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.acha.2010.04.001

Keywords

Indefinite kernel; Mercer kernel; Coefficient regularization; Least square regression; Integral operator; Capacity independent error bounds; Learning rates

Funding

  1. Nature Science Fund of Shandong Province, China [Y2007A11]
  2. University of Jinan [XBS0832]


In this paper, we provide a mathematical foundation for least square regression learning with an indefinite kernel and coefficient regularization. Beyond continuity and boundedness, the kernel function need not satisfy any further regularity conditions. An explicit expression for the solution in terms of the sampling operator and the empirical integral operator is derived and plays an important role in our analysis. It yields a natural error decomposition in which the approximation error is characterized by a reproducing kernel Hilbert space associated with a certain Mercer kernel. A careful analysis shows that the sample error decays as O(1/√m). We deduce the error bound and prove asymptotic convergence. Satisfactory learning rates are then derived under a very mild regularity condition on the regression function. When the kernel is itself a Mercer kernel, better rates are obtained by a rigorous analysis showing that coefficient regularization is powerful for learning smooth functions. The saturation effect and the relation to spectral algorithms are discussed. (C) 2010 Elsevier Inc. All rights reserved.
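The general scheme the abstract describes can be sketched numerically: an estimator of the form f(x) = Σ_i α_i K(x, x_i), where the coefficient vector α minimizes an empirical least-square error plus an l2 penalty on the coefficients, and K may be indefinite (its Gram matrix need not be positive semi-definite). The sketch below is a minimal illustration, not the paper's algorithm: the sine kernel, the particular normalization of the regularization term (λ·m·‖α‖²), and all function names are assumptions made for the example.

```python
import numpy as np

def indefinite_kernel(x, t):
    # Illustrative indefinite kernel (hypothetical choice):
    # its Gram matrix is generally not positive semi-definite.
    return np.sin(x - t)

def fit_coefficients(X, y, lam, kernel=indefinite_kernel):
    """Solve min_a (1/m)||K a - y||^2 + lam * m * ||a||^2.

    Setting the gradient to zero gives the normal equations
    (K^T K + lam * m^2 * I) a = K^T y, which are solvable for
    any lam > 0 even when K is indefinite or non-symmetric.
    """
    m = len(X)
    K = kernel(X[:, None], X[None, :])          # m x m Gram matrix
    A = K.T @ K + lam * m**2 * np.eye(m)        # positive definite for lam > 0
    alpha = np.linalg.solve(A, K.T @ y)
    return alpha, K

def predict(alpha, X_train, x_new, kernel=indefinite_kernel):
    # f(x) = sum_i alpha_i * K(x, x_i)
    return kernel(np.asarray(x_new)[:, None], X_train[None, :]) @ alpha
```

Note that because the penalty acts on the coefficients rather than on an RKHS norm, no positive-definiteness of K is needed to make the problem well posed; regularization alone guarantees a unique solution.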

