Article

Learning rates of least-square regularized regression

Journal

FOUNDATIONS OF COMPUTATIONAL MATHEMATICS
Volume 6, Issue 2, Pages 171-192

Publisher

SPRINGER
DOI: 10.1007/s10208-004-0155-9

Keywords

learning theory; reproducing kernel Hilbert space; regularization error; covering number; regularization scheme

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The aim is an error analysis of the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. When the kernel is C^∞ and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is m^(−ζ) with ζ arbitrarily close to 1, regardless of the variance of the bounded probability distribution.
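The regularized least-squares scheme the abstract refers to can be sketched as kernel ridge regression: minimize the empirical squared error plus λ times the squared RKHS norm, which by the representer theorem reduces to a linear system in the kernel matrix. The sketch below is illustrative only; the Gaussian kernel, the kernel width, the λ value, and the toy sine data are assumptions for the example, not details from the paper.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gaussian kernel, a C-infinity kernel of the kind the abstract mentions
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_regularized_ls(X, y, lam, sigma=1.0):
    # Representer theorem: the minimizer has the form f(x) = sum_i alpha_i K(x, x_i),
    # with alpha solving (K + m*lam*I) alpha = y for the empirical scheme
    m = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    # Evaluate f(x) = sum_i alpha_i K(x, x_i) at the new points
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy usage: noisy samples of sin(x) on [0, 2*pi] (hypothetical data)
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = fit_regularized_ls(X, y, lam=1e-3)
y_hat = predict(X, alpha, X)
```

As the sample size m grows (with λ chosen appropriately), the error of such an estimator decays at the rates the paper analyzes.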
