4.7 Article

Robust support vector machine with generalized quantile loss for classification and regression

Journal

APPLIED SOFT COMPUTING
Volume 81, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2019.105483

Keywords

Robustness; Quantile; Correntropy; Non-convex loss; Support vector machines

Funding

  1. National Natural Science Foundation of China [11471010, 11271367]


A new robust loss function (called the Lq-loss) is proposed based on the concepts of quantile and correntropy, and it can be seen as an improved version of the quantile loss function. The proposed Lq-loss has several important properties, namely asymmetry, non-convexity, and boundedness, which have received considerable attention recently. The Lq-loss includes and extends traditional loss functions such as the pinball loss, rescaled hinge loss, L1-norm loss, and zero-norm loss. Additionally, we demonstrate that the Lq-loss is a kernel-induced loss generated by a reproducing piecewise kernel function. Further, two robust SVM frameworks are presented for robust classification and regression problems, respectively, by applying the Lq-loss to the support vector machine. Last but not least, we demonstrate that the proposed classification framework satisfies the Bayes optimal decision rule. However, the non-convexity of the proposed Lq-loss makes it difficult to optimize. A non-convex optimization method, the concave-convex procedure (CCCP), is used to solve the proposed models, and the convergence of the algorithms is proved theoretically. For classification and regression tasks, experiments are carried out on three groups of datasets: UCI benchmark datasets, artificial datasets, and a practical application dataset. Compared with some classical and state-of-the-art methods, numerical simulations under different noise settings and different evaluation criteria show that the proposed methods are robust to feature noise and outliers in both classification and regression applications. (C) 2019 Elsevier B.V. All rights reserved.
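
The abstract does not spell out the functional form of the Lq-loss, but a natural reading of "quantile plus correntropy" is a pinball (quantile) loss passed through a bounded exponential rescaling, analogous to the rescaled hinge loss it is said to generalize. The sketch below is only that reading: the names `lq_loss`, `pinball_loss`, `tau`, and `eta`, and the exact formula, are illustrative assumptions, not the paper's definition.

```python
import numpy as np


def pinball_loss(u, tau=0.5):
    """Quantile (pinball) loss: tau * u for u >= 0, (tau - 1) * u for u < 0."""
    u = np.asarray(u, dtype=float)
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)


def lq_loss(u, tau=0.5, eta=1.0):
    """Hypothetical correntropy-rescaled quantile loss (illustrative sketch only).

    Passes the pinball loss through a bounded exponential (Welsch-style)
    rescaling, in the spirit of the rescaled hinge loss:
        L(u) = beta * (1 - exp(-eta * pinball_tau(u))),  beta = 1 / (1 - exp(-eta)),
    so the loss is asymmetric via tau, non-convex and bounded via the
    exponential, and equals 1 where the pinball loss equals 1.
    """
    beta = 1.0 / (1.0 - np.exp(-eta))
    return beta * (1.0 - np.exp(-eta * pinball_loss(u, tau)))


if __name__ == "__main__":
    residuals = np.linspace(-10, 10, 9)
    # Large residuals saturate instead of growing linearly, which is what
    # gives a bounded loss of this kind its robustness to outliers.
    print(np.round(lq_loss(residuals, tau=0.7, eta=1.0), 3))
```

Because such a bounded loss is non-convex, an SVM objective built on it would typically be minimized, as the abstract notes, with a CCCP-type scheme that repeatedly replaces the concave part of the objective with its linearization and solves the resulting convex subproblem.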

