Article

A novel differential evolution algorithm integrating opposition-based learning and adjacent two generations hybrid competition for parameter selection of SVM

Journal

EVOLVING SYSTEMS
Volume 12, Issue 1, Pages 207-215

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s12530-019-09313-5

Keywords

Support vector machines; Differential evolution algorithm; Opposition-based learning; Adjacent two generations hybrid competition; Parameter optimization

Funding

  1. National Natural Science Foundation of China [61572381]
  2. Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System (Wuhan University of Science and Technology) [znxx2018QN06]


The proposed DGODE-SVM algorithm improves parameter selection for SVM by integrating opposition-based learning with a hybrid competition between adjacent generations. Experimental results demonstrate that it outperforms the comparison algorithms in classification accuracy.
The generalization performance of a support vector machine (SVM) with a Gaussian kernel depends on its model parameters: the error penalty parameter and the Gaussian kernel parameter. Differential evolution (DE) algorithms have strong search ability and are easy to implement, but they fall into local optima easily. Hence, a novel differential evolution algorithm that integrates opposition-based learning and a hybrid competition between adjacent generations is put forward for parameter selection of SVM (DGODE-SVM). In the DGODE-SVM algorithm, opposition-based learning and the hybrid competition between adjacent generations are inserted into the differential evolution process. Experimental results on nineteen UCI datasets clearly show that, compared with the ODE-SVM, SaDE-SVM, DE-SVM, SVM, C4.5, KNN and NB algorithms, the proposed algorithm achieves higher classification accuracy.
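To illustrate the kind of search the abstract describes, the following is a minimal sketch of a differential evolution loop with opposition-based learning, which could be used to tune SVM parameters such as the penalty C and the Gaussian kernel gamma. This is not the authors' DGODE-SVM implementation: the toy objective stands in for SVM cross-validation accuracy, only the opposition-based initialization is shown, and the paper's hybrid competition between adjacent generations is not reproduced. All names and parameter values are illustrative.

```python
import random

def de_with_obl(objective, bounds, pop_size=20, generations=50,
                F=0.5, CR=0.9, seed=0):
    """Differential evolution (DE/rand/1/bin) with opposition-based
    learning (OBL) at initialization.

    objective: function to MINIMIZE over a parameter list (for SVM tuning
               this would wrap cross-validation error for given C, gamma).
    bounds:    per-dimension (low, high) search ranges.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def rand_vec():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    def opposite(x):
        # OBL: the opposite of x_i within [lo_i, hi_i] is lo_i + hi_i - x_i.
        return [lo + hi - xi for (lo, hi), xi in zip(bounds, x)]

    # Opposition-based initialization: evaluate the random population and
    # its opposite population, keep the best pop_size candidates overall.
    pool = [rand_vec() for _ in range(pop_size)]
    pool += [opposite(x) for x in pool]
    pool.sort(key=objective)
    pop = pool[:pop_size]
    fit = [objective(x) for x in pop]

    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1 mutation from three distinct other individuals.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                      for d in range(dim)]
            # Clip the mutant back into the search bounds.
            mutant = [min(max(m, lo), hi)
                      for m, (lo, hi) in zip(mutant, bounds)]
            # Binomial crossover with a guaranteed mutant coordinate.
            jr = rng.randrange(dim)
            trial = [mutant[d] if (rng.random() < CR or d == jr)
                     else pop[i][d] for d in range(dim)]
            # Greedy one-to-one selection.
            f_trial = objective(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial

    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

As a usage sketch, minimizing a quadratic surrogate for "cross-validation error" over hypothetical C in [0, 10] and gamma in [0, 1] would look like `de_with_obl(lambda x: (x[0] - 5) ** 2 + (x[1] - 0.1) ** 2, [(0, 10), (0, 1)])`; in a real DGODE-SVM setting the objective would train and validate an SVM per candidate.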
