Journal
APPLIED SOFT COMPUTING
Volume 80, Issue -, Pages 202-210
Publisher
ELSEVIER
DOI: 10.1016/j.asoc.2019.03.037
Keywords
Support vector machine; Parameter selection; Grid search
The support vector machine (SVM) is widely regarded as one of the most effective classifiers. However, the time complexity of kernel SVM, which is quadratic in the number of training patterns, makes it impractical for large data sets. The cost grows further when an exhaustive grid search is used to find the optimal parameters (the kernel parameters and the penalty parameter, C). To reduce this extra complexity, a novel approach is proposed that prunes the data by removing points that have an extremely small chance of becoming support vectors. This is accomplished by using the support vectors obtained from training an SVM with a smaller value of C as the training patterns for an SVM with a slightly larger value. This reduces the grid-search time both for the standard SVM and for approximate methods that first search heuristically over a small range of the kernel parameters. Experiments showed that the proposed approach considerably reduces the training time of both methods while achieving accuracy similar to the standard SVM. In addition, the training time of the latter method was shorter than that of evolutionary techniques based on the particle swarm optimization algorithm. (C) 2019 Elsevier B.V. All rights reserved.
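The pruning idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel, an ascending list of C values, and uses scikit-learn's `SVC`; the function name `pruned_grid_search` and all parameter choices are hypothetical.

```python
# Hedged sketch of C-path pruning for SVM grid search (illustration only,
# not the paper's code): train with ascending C, and after each fit keep
# only the support vectors as the training set for the next, larger C.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def pruned_grid_search(X, y, C_values, gamma=0.5):
    """Fit SVMs over ascending C, reusing each model's support vectors
    as the (pruned) training set for the next value of C."""
    Xs, ys = X, y
    results = []
    for C in sorted(C_values):
        clf = SVC(C=C, kernel="rbf", gamma=gamma)
        clf.fit(Xs, ys)
        # Record the model and the size of the training set it saw.
        results.append((C, clf, len(Xs)))
        # Prune: non-support-vectors have essentially no chance of
        # becoming support vectors at a slightly larger C.
        Xs, ys = Xs[clf.support_], ys[clf.support_]
    return results

# Small synthetic demo: the effective training-set size shrinks along the C path.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
models = pruned_grid_search(X, y, C_values=[0.1, 1.0, 10.0])
sizes = [n for _, _, n in models]
```

Because each pruned set is a subset of the previous one, the per-fit cost of the grid search drops monotonically along the C path, which is where the claimed speed-up comes from.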