Article

Multi-kernel regularized classifiers

Journal

JOURNAL OF COMPLEXITY
Volume 23, Issue 1, Pages 108-134

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jco.2006.06.007

Keywords

classification algorithm; multi-kernel regularization scheme; convex loss function; misclassification error; regularization error and sample error

Abstract

A family of classification algorithms generated from Tikhonov regularization schemes is considered. They involve multi-kernel spaces and general convex loss functions. Our main purpose is to provide satisfactory estimates for the excess misclassification error of these multi-kernel regularized classifiers when the loss functions achieve the zero value. The error analysis consists of two parts: regularization error and sample error. Allowing multi-kernels in the algorithm improves the regularization error and approximation error, which is one advantage of the multi-kernel setting. For a general loss function, we show how to bound the regularization error by the approximation in some weighted L^q spaces. For the sample error, we use a projection operator. The projection, in connection with the decay of the regularization error, enables us to improve convergence rates in the literature even for the one-kernel schemes and special loss functions: the least-squares loss and the hinge loss for support vector machine soft margin classifiers. Existence of the optimization problem for the regularization scheme associated with multi-kernels is verified when the kernel functions are continuous with respect to the index set. Concrete examples, including Gaussian kernels with flexible variances and probability distributions satisfying certain noise conditions, are used to illustrate the general theory. (c) 2006 Elsevier Inc. All rights reserved.
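
Sketch of the scheme: as described in the abstract, a multi-kernel regularized classifier is obtained by minimizing a penalized empirical risk, roughly f_z = argmin over the kernel index sigma and over functions f in the reproducing kernel Hilbert space H_{K_sigma} of (1/m) * sum_i phi(y_i f(x_i)) + lambda * ||f||_{K_sigma}^2, and then classifying with sign(f_z). The Python sketch below is an illustration under stated assumptions, not the paper's algorithm: it uses the least-squares loss phi(t) = (1 - t)^2, which admits a closed-form minimizer, and a small finite grid of Gaussian variances standing in for a general index set; the function names, the variance grid, and the toy data are invented for the example.

# Minimal sketch (illustrative assumptions, not the paper's algorithm):
# multi-kernel Tikhonov regularization with least-squares loss, searching
# over a finite grid of Gaussian kernel variances.
import numpy as np

def gaussian_kernel(X1, X2, sigma):
    """Gaussian kernel matrix K_sigma(x, x') = exp(-|x - x'|^2 / (2 sigma^2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_multi_kernel(X, y, sigmas, lam):
    """Minimize (1/m) sum_i (1 - y_i f(x_i))^2 + lam ||f||_{K_sigma}^2 over
    f in H_{K_sigma} and over sigma in the (finite) index set; return the
    kernel index and coefficient vector attaining the smallest penalized risk."""
    m = len(y)
    best = None
    for sigma in sigmas:
        K = gaussian_kernel(X, X, sigma)
        # Closed-form minimizer for the least-squares loss: (K + lam*m*I) alpha = y
        alpha = np.linalg.solve(K + lam * m * np.eye(m), y)
        f_train = K @ alpha
        risk = np.mean((1.0 - y * f_train) ** 2) + lam * alpha @ K @ alpha
        if best is None or risk < best[0]:
            best = (risk, sigma, alpha)
    return best[1], best[2]

def predict(X_train, alpha, sigma, X_new):
    """Binary classifier sign(f_z(x)) induced by the regularized minimizer."""
    return np.sign(gaussian_kernel(X_new, X_train, sigma) @ alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 2))
    y = np.sign(X[:, 0] + 0.3 * rng.normal(size=60))   # noisy toy labels in {-1, +1}
    sigma, alpha = fit_multi_kernel(X, y, sigmas=[0.5, 1.0, 2.0], lam=0.01)
    print("chosen variance:", sigma,
          "training error:", np.mean(predict(X, alpha, sigma, X) != y))

With the least-squares loss the inner minimization reduces to the linear system (K + lam*m*I) alpha = y; a different convex loss, such as the hinge loss, would require a generic convex solver at that step, while the outer search over the kernel index set is unchanged.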
