Journal
MATHEMATICS
Volume 11, Issue 17, Pages -
Publisher
MDPI
DOI: 10.3390/math11173721
Keywords
support vector machine; feature selection; sparse optimization; multi-objective optimization problems; multi-objective machine learning
Summary
This article discusses the design of linear Support Vector Machine (SVM) classification techniques as multi-objective optimization problems. The authors focus on applying sparse optimization to feature selection for multi-objective linear SVM, and emphasize the advantages of treating linear SVM classification as a multi-objective optimization problem.
The design of linear Support Vector Machine (SVM) classification techniques is generally a Multi-objective Optimization Problem (MOP): these techniques must find appropriate trade-offs between two objectives, such as the amount of misclassified training data (classification error) and the number of non-zero elements of the separating hyperplane. In this article, we review several linear SVM classification models formulated as multi-objective optimization problems. We place particular emphasis on applying sparse optimization (minimizing the number of non-zero elements of the separating hyperplane) to Feature Selection (FS) for multi-objective linear SVM. Our primary purpose is to demonstrate the advantages of treating linear SVM classification techniques as MOPs: in the multi-objective case, we obtain a set of Pareto optimal solutions instead of the single optimal solution of the single-objective case. Results for these linear SVMs are reported on several classification datasets. The test problems are specifically designed to stress the number of non-zero components of the normal vector of the separating hyperplane. We used these datasets with both multi-objective and single-objective models.
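To illustrate the trade-off the abstract describes, here is a minimal sketch (not the paper's own method or datasets): it sweeps the regularization parameter C of an L1-penalized linear SVM on a synthetic dataset, records the two objectives (training classification error and number of non-zero weight components), and keeps the non-dominated points. The dataset, parameter grid, and use of scikit-learn's `LinearSVC` are all assumptions for illustration.

```python
# Hedged sketch: approximate the error-vs-sparsity Pareto front of a linear SVM
# by scalarization, i.e. sweeping the C parameter of an L1-regularized model.
# Synthetic data and the C grid are illustrative choices, not from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                           n_redundant=0, random_state=0)

points = []  # (classification error on training data, non-zero weights)
for C in np.logspace(-3, 2, 20):
    clf = LinearSVC(penalty="l1", dual=False, C=C, max_iter=5000).fit(X, y)
    error = 1.0 - clf.score(X, y)            # objective 1: misclassification
    nnz = int(np.count_nonzero(clf.coef_))   # objective 2: hyperplane sparsity
    points.append((error, nnz))

def pareto_front(pts):
    # A point is Pareto optimal if no other point is at least as good in
    # both objectives and strictly better in at least one.
    return [p for p in pts
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in pts)]

front = sorted(set(pareto_front(points)))
print(front)  # fewer non-zero weights trades against higher training error
```

Sweeping a single weight C recovers only one scalarization of the bi-objective problem; the multi-objective formulations reviewed in the article aim to produce such a set of Pareto optimal solutions directly rather than one solution per run.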