Article; Proceedings Paper

On the generalization of soft margin algorithms

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 48, Issue 10, Pages 2721-2735

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2002.802647

Keywords

generalization; margin; margin distribution; neural networks; probably approximately correct (PAC) learning; ridge regression; soft margin; statistical learning; support vector machines (SVMs)


Generalization bounds depending on the margin of a classifier are a relatively recent development. They provide an explanation of the performance of state-of-the-art learning systems such as support vector machines (SVMs) [1] and Adaboost [2]. The difficulty with these bounds has been either their lack of robustness or their looseness. The question of whether the generalization of a classifier can be more tightly bounded in terms of a robust measure of the distribution of margin values has remained open for some time. The paper answers this open question in the affirmative and, furthermore, the analysis leads to bounds that motivate the previously heuristic soft margin SVM algorithms as well as justifying the use of the quadratic loss in neural network training algorithms. The results are extended to give bounds for the probability of failing to achieve a target accuracy in regression prediction, with a statistical analysis of ridge regression and Gaussian processes as a special case. The analysis presented in the paper has also led to new boosting algorithms described elsewhere.
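As a minimal illustration of the soft margin idea the abstract refers to (not the paper's own algorithm or bounds), the sketch below trains a linear soft-margin classifier by subgradient descent on the regularized hinge loss. The per-example slack terms max(0, 1 - y_i <w, x_i>) are exactly the kind of robust summary of the margin distribution that margin-distribution bounds control; all function names and hyperparameter values here are illustrative assumptions.

```python
# Sketch: linear soft-margin classification via subgradient descent on
#   (lambda/2) ||w||^2 + (1/m) * sum_i max(0, 1 - y_i <w, x_i>).
# The hinge terms play the role of margin slack variables.
import numpy as np

def train_soft_margin(X, y, lam=0.01, lr=0.1, epochs=200):
    """X: (m, d) inputs; y: (m,) labels in {-1, +1}. Returns weight vector w."""
    m, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)          # functional margins y_i <w, x_i>
        active = margins < 1           # points violating the target margin of 1
        # Subgradient of the regularized hinge objective at the current w.
        grad = lam * w - (X[active] * y[active, None]).sum(axis=0) / m
        w -= lr * grad
    return w

# Toy usage: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w = train_soft_margin(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

On data that is not linearly separable, some slack terms remain positive after training; the bounds in the paper quantify generalization in terms of such a soft measure rather than requiring every point to clear a hard margin.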

