4.1 Article

The analysis of decomposition methods for support vector machines

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 11, Issue 4, Pages 1003-1008

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/72.857780

Keywords

decomposition methods; projected gradients; support vector machines

Abstract

The support vector machine (SVM) is a promising technique for pattern recognition. It requires the solution of a large dense quadratic programming problem, and traditional optimization methods cannot be applied directly because of memory restrictions. Up to now, very few methods can handle the memory problem; an important one is the decomposition method, for which, however, no convergence proof has been given. In this paper, we connect decomposition methods to projected gradient methods and provide theoretical proofs for one version of the decomposition method. An extension to the bound-constrained formulation of SVM is also provided. We then show that this convergence proof remains valid for general decomposition methods whose working set selection meets a simple requirement.
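For readers unfamiliar with the setting, the following is a minimal sketch of the dual quadratic program behind the SVM and of the generic decomposition subproblem the abstract refers to; the notation ($\alpha$, $Q$, $e$, $C$, working set $B$, fixed set $N$) is the conventional one and is assumed here rather than quoted from the paper.

\[
\begin{aligned}
\min_{\alpha}\ \ & \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha \;-\; e^{\top}\alpha \\
\text{subject to}\ \ & y^{\top}\alpha = 0,\qquad 0 \le \alpha_i \le C,\ \ i=1,\dots,l,
\end{aligned}
\]

where $Q_{ij} = y_i y_j K(x_i, x_j)$ is a dense $l \times l$ matrix, which is why the full problem cannot be stored in memory for large training sets. A decomposition iteration splits the indices into a small working set $B$ and its complement $N$, fixes $\alpha_N$, and solves only the subproblem in $\alpha_B$:

\[
\begin{aligned}
\min_{\alpha_B}\ \ & \tfrac{1}{2}\,\alpha_B^{\top} Q_{BB}\,\alpha_B \;+\; \left(Q_{BN}\alpha_N - e_B\right)^{\top}\alpha_B \\
\text{subject to}\ \ & y_B^{\top}\alpha_B = -\,y_N^{\top}\alpha_N,\qquad 0 \le \alpha_i \le C,\ \ i \in B.
\end{aligned}
\]

Convergence analyses of the kind described in the abstract concern how the working set $B$ is selected at each iteration and whether the resulting sequence of iterates converges to an optimum of the full problem.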

