Article

Consistency of support vector machines and other regularized kernel classifiers

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 51, Issue 1, Pages 128-142

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2004.839514

Keywords

computational learning theory; kernel methods; pattern recognition; regularization; support vector machines (SVMs); universal consistency


It is shown that various classifiers that are based on minimization of a regularized risk are universally consistent, i.e., they can asymptotically learn in every classification task. The role of the loss functions used in these algorithms is considered in detail. As an application of our general framework, several types of support vector machines (SVMs) as well as regularization networks are treated. Our methods combine techniques from stochastics, approximation theory, and functional analysis.
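The regularized risk minimization at the heart of this framework can be sketched concretely. Below is an illustrative example, not the paper's method: it minimizes the regularized hinge risk of a linear-kernel soft-margin SVM by plain subgradient descent on a made-up toy dataset (the function names, step size, and data are my own choices; the paper's analysis covers general kernels and a broad class of loss functions, with the regularization parameter shrinking suitably as the sample size grows).

```python
import numpy as np

# Regularized empirical risk under the hinge loss, the objective behind
# soft-margin SVMs:
#   R_reg(w) = lam * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i <w, x_i>)

def hinge_risk(w, X, y, lam):
    """Regularized empirical hinge risk of the linear classifier w."""
    margins = y * (X @ w)
    return lam * w @ w + np.mean(np.maximum(0.0, 1.0 - margins))

def fit_linear_svm(X, y, lam=0.01, lr=0.1, n_iter=500):
    """Minimize the regularized hinge risk by subgradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        active = margins < 1.0  # points where the hinge loss has a nonzero subgradient
        grad = 2 * lam * w - (y[active, None] * X[active]).sum(axis=0) / n
        w -= lr * grad
    return w

# Hypothetical toy task: two well-separated Gaussian clouds in the plane.
rng = np.random.default_rng(0)
n = 200
X = np.vstack([rng.normal(-2.0, 1.0, (n // 2, 2)),
               rng.normal(2.0, 1.0, (n // 2, 2))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

w = fit_linear_svm(X, y)
accuracy = np.mean(np.sign(X @ w) == y)
```

Universal consistency, as studied in the paper, concerns what happens as the sample size tends to infinity with an appropriately decaying regularization parameter; the sketch above only shows the finite-sample objective being minimized for one fixed choice of `lam`.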
