Article

A fast parameter optimization approach based on the inter-cluster induced distance in the feature space for support vector machines

Journal

APPLIED SOFT COMPUTING
Volume 118, Article 108519

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2022.108519

Keywords

Support vector machine; Parameter optimization; Inter-cluster induced distance; Kernel parameter; Modified golden section algorithm

Funding

  1. Key scientific research project of Henan higher education institutions, China [20A413006]
  2. Programs Foundation of Henan Polytechnic University [760807/006]
  3. National Natural Science Foundation of China [61973105]
  4. Innovative Scientists and Technicians Team of Henan Provincial High Education [20IRTSTHN019]
  5. Innovative Scientists and Technicians Team of Henan Polytechnic University [T2019-2]
  6. Henan Province Scientific and Technological Project of China [212102210197, 212102210145]

This paper proposes a new method for optimizing the kernel and penalty parameters of SVM classifiers. The method introduces a new distance measure in the feature space and presents a fast parameter optimization approach that significantly reduces training time while maintaining competitive model accuracy.
This paper focuses on the problem of optimizing the kernel and penalty parameters of SVM classifiers with a Gaussian kernel. To reduce the computational overhead of the inter-cluster distance in the feature space (ICDF), which in previous studies must be evaluated for a large number of candidate discretized values over a large interval, a new inter-cluster induced distance in the feature space (ICIDF) is proposed to guide the kernel parameter selection of SVMs, and the theorem that the ICIDF is a positive, strictly unimodal function of the Gaussian kernel parameter is presented for the first time. Based on this theorem, a fast two-stage parameter optimization approach is presented for SVMs. In the first stage, a modified golden section algorithm (MGSA) is proposed to obtain a shrunken value interval for the kernel parameter using only a small number of ICIDF calculations. In the second stage, a differential evolution algorithm (BBDE or SADE) is applied to select the best parameter combination for the SVM within the shrunken kernel parameter interval obtained by MGSA and a given interval of the penalty parameter. Experiments on benchmark datasets illustrate that the training time of SVM models can be significantly shortened by our approach, while the testing accuracy of the trained SVMs remains competitive. (C) 2022 Elsevier B.V. All rights reserved.
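
Since the authors' implementation is not reproduced here, the following is a minimal Python sketch of the two-stage idea described in the abstract, under stated assumptions: the icidf_like separation measure is a simple stand-in for the paper's ICIDF criterion, the golden-section routine is a plain (unmodified) version of the MGSA concept, and SciPy's standard differential_evolution replaces the BBDE/SADE variants used by the authors. The dataset and parameter ranges are illustrative only.

```python
# Hypothetical sketch (not the authors' code) of the two-stage scheme:
# Stage 1: golden-section search shrinks the Gaussian kernel parameter
#          interval using a cheap, assumed-unimodal separation criterion.
# Stage 2: differential evolution picks (C, gamma) inside the shrunk interval.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def icidf_like(gamma, X, y):
    """Placeholder separation measure in the Gaussian feature space.

    Uses the kernel-induced squared distance between the two class means,
    mean(K11) + mean(K22) - 2*mean(K12); the paper's ICIDF criterion
    may be defined differently.
    """
    X1, X2 = X[y == 0], X[y == 1]

    def kmean(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2).mean()

    return kmean(X1, X1) + kmean(X2, X2) - 2.0 * kmean(X1, X2)


def golden_section_shrink(f, lo, hi, iters=15):
    """Shrink [lo, hi] around the maximum of a unimodal function f."""
    r = (np.sqrt(5.0) - 1.0) / 2.0            # golden ratio factor
    a, b = lo, hi
    c, d = b - r * (b - a), a + r * (b - a)
    fc, fd = f(c), f(d)
    for _ in range(iters):
        if fc > fd:                            # maximum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - r * (b - a)
            fc = f(c)
        else:                                  # maximum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + r * (b - a)
            fd = f(d)
    return a, b


X, y = load_breast_cancer(return_X_y=True)
X = (X - X.mean(0)) / X.std(0)

# Stage 1: narrow the gamma interval with few criterion evaluations.
g_lo, g_hi = golden_section_shrink(lambda g: icidf_like(g, X, y), 1e-4, 10.0)

# Stage 2: plain differential evolution over (log2 C, gamma);
# the paper applies BBDE/SADE variants instead.
def neg_cv_accuracy(params):
    C, gamma = 2.0 ** params[0], params[1]
    return -cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

result = differential_evolution(neg_cv_accuracy,
                                bounds=[(-5, 15), (g_lo, g_hi)],
                                maxiter=20, popsize=10, seed=0)
best_C, best_gamma = 2.0 ** result.x[0], result.x[1]
print(f"C={best_C:.3g}, gamma={best_gamma:.3g}, CV acc={-result.fun:.3f}")
```

The reasoning behind stage 1 is that the separation criterion is far cheaper to evaluate than training an SVM, so shrinking the kernel parameter range before the expensive evolutionary search is what yields the reported reduction in training time.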
