Article

Speed up grid-search for parameter selection of support vector machines

Journal

APPLIED SOFT COMPUTING
Volume 80, Issue -, Pages 202-210

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2019.03.037

Keywords

Support vector machine; Parameter selection; Grid search

Abstract

The support vector machine (SVM) has recently been considered one of the most efficient classifiers. However, the time complexity of kernel SVM, which is quadratic in the number of training patterns, makes it impractical for large data sets. The cost grows further when an exhaustive grid search is used to find the optimal parameters (the kernel parameters and the penalty parameter C). To reduce this extra complexity, a novel approach is proposed that prunes the data by removing points that have an extremely small chance of becoming support vectors. This is accomplished by using the support vectors obtained from training an SVM with a smaller value of C as the training patterns for an SVM with a slightly larger value. This reduces the grid-search time both for the standard SVM and for approximate methods that first search heuristically over a small range of the kernel parameters. Experiments showed that the proposed approach reduces the training time of both methods considerably while achieving accuracy similar to the standard SVM. In addition, the training time of the latter method was shorter than that of evolutionary techniques based on the particle swarm optimization algorithm. (C) 2019 Elsevier B.V. All rights reserved.
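The pruning idea described in the abstract lends itself to a short illustration. Below is a minimal sketch, assuming scikit-learn's SVC; the synthetic data set, the fixed kernel parameter gamma, and the ascending grid c_grid are illustrative placeholders rather than values or code from the paper, and the sketch is not the authors' exact implementation.

```python
# Sketch of the pruning idea: for an ascending grid of C values, reuse the
# support vectors found at the previous (smaller) C as the training set for
# the next, slightly larger C, so each subsequent fit sees fewer patterns.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

gamma = 0.1                       # fixed kernel parameter for this sweep (illustrative)
c_grid = [0.01, 0.1, 1, 10, 100]  # ascending penalty values (illustrative)

idx = np.arange(len(X))           # start from the full training set
results = []
for C in c_grid:
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    clf.fit(X[idx], y[idx])
    results.append((C, len(idx), clf.score(X, y)))
    # keep only the support vectors as candidates for the next, larger C
    idx = idx[clf.support_]

for C, n_train, acc in results:
    print(f"C={C:<6} trained on {n_train:5d} patterns, accuracy on full set={acc:.3f}")
```

In a full grid search the same shrinking training set would be swept for each kernel-parameter value, which is where the reported speed-up over retraining on all patterns at every grid point comes from.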
