Article

Valley-loss regular simplex support vector machine for robust multiclass classification

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 216, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2021.106801

Keywords

Feature noise and outlier labels; Robust K-class classifier; Sparseness; Valley-loss function; Regular simplex support vector machine

Funding

  1. National Natural Science Foundation Project, China [61503085]
  2. Guangdong Natural Science Foundation, China [2017A030313348]
  3. National Natural Science Foundation, China Key Project Subproject [71731009]
  4. China Scholarship Council Fund [201708440002]
  5. Paul and Heidi Brown Preeminent Professorship at the Department of Industrial and Systems Engineering, University of Florida (USA)
  6. Humboldt Research Award (Germany)


This paper presents a newly proposed valley-loss regular simplex support vector machine (V-RSSVM) for robust multiclass classification, offering robustness to feature noise and outlier labels as well as excellent sparseness. To train V-RSSVM quickly, a speed-up-oriented initial-solution strategy and a dedicated solver are developed.
Handling noise and outlier data is an important issue for support vector machines (SVMs). Although the pinball-loss SVM (Pin-SVM) and ramp-loss SVM (Ramp-SVM) can deal with feature noise and outlier labels respectively, neither can handle both, and extending them from binary to multiclass classification usually requires partitioning strategies. Since the regular simplex support vector machine (RSSVM) has been proposed as a novel all-in-one K-class classification model with clear advantages over partitioning strategies, it is promising to develop a novel loss function that is simultaneously robust to feature noise and insensitive to outlier labels, and to embed it into the RSSVM framework. In this paper, a newly proposed valley-loss regular simplex support vector machine (V-RSSVM) for robust multiclass classification is presented. Inheriting the merits of both the pinball-type and ramp-type losses, the valley loss enjoys not only robustness to feature noise and outlier labels but also excellent sparseness. To train V-RSSVM quickly, a Concave-Convex Procedure (CCCP)-assisted sequential minimal optimization (SMO)-type solver and a speed-up-oriented initial-solution strategy were developed. We also investigate the robustness, generalization error bound, and sparseness of V-RSSVM in theory. Numerical results on twenty-five real-life data sets verify the effectiveness of the proposed V-RSSVM model. (C) 2021 Elsevier B.V. All rights reserved.
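The abstract describes the valley loss as inheriting the pinball-type loss (robustness to feature noise) and the ramp-type loss (insensitivity to outlier labels via truncation). The exact definition is given in the paper itself, not in this record; the sketch below is only an illustrative, hypothetical valley-shaped loss built on that idea: a pinball loss on the margin, truncated from above in the style of a ramp loss. The parameters `tau` and `s` are assumptions for illustration.

```python
import numpy as np

def valley_loss(u, tau=0.5, s=-1.0):
    """Hypothetical valley-type loss on the classification margin u.

    pinball part: penalizes both small margins (slope -1) and overly
    large margins (slope tau), which resists feature noise.
    ramp-style cap: bounds the loss at 1 - s, so mislabeled outliers
    with very negative margins contribute only a constant penalty.
    """
    pinball = np.maximum(1.0 - u, -tau * (1.0 - u))  # two-sided pinball penalty
    cap = 1.0 - s                                    # ramp-style ceiling
    return np.minimum(pinball, cap)
```

For example, a point exactly on the target margin (`u = 1`) incurs zero loss, while a badly mislabeled point (`u = -5`) is capped at `1 - s = 2` instead of growing linearly, which is the source of the outlier-label robustness claimed in the abstract.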

