Article

Robust support vector machine with generalized quantile loss for classification and regression

Journal

APPLIED SOFT COMPUTING
Volume 81

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2019.105483

Keywords

Robustness; Quantile; Correntropy; Non-convex loss; Support vector machines

Funding

  1. National Natural Science Foundation of China [11471010, 11271367]


A new robust loss function (called the L_q-loss) is proposed based on the concepts of quantile and correntropy, and it can be seen as an improved version of the quantile loss function. The proposed L_q-loss has several important properties, namely asymmetry, non-convexity and boundedness, which have received a lot of attention recently. The L_q-loss includes and extends traditional loss functions such as the pinball loss, rescaled hinge loss, L1-norm loss and zero-norm loss. Additionally, we demonstrate that the L_q-loss is a kernel-induced loss defined by a reproducing piecewise kernel function. Further, two robust SVM frameworks are presented to handle robust classification and regression problems, respectively, by applying the L_q-loss to the support vector machine. Last but not least, we demonstrate that the proposed classification framework satisfies the Bayes optimal decision rule. However, the non-convexity of the proposed L_q-loss makes the resulting models difficult to optimize. A non-convex optimization method, the concave-convex procedure (CCCP), is used to solve the proposed models, and the convergence of the algorithms is proved theoretically. For classification and regression tasks, experiments are carried out on three groups of data: UCI benchmark datasets, artificial datasets and a practical application dataset. Compared with some classical and advanced methods, numerical simulations under different noise settings and different evaluation criteria show that the proposed methods are robust to feature noise and outliers in both classification and regression applications.
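The abstract does not give the closed form of the L_q-loss, but its stated ingredients (a quantile/pinball loss combined with a correntropy-style bounded, non-convex rescaling) suggest a construction along the following lines. The sketch below is only an illustration under that assumption; the function names, the quantile parameter tau and the scale parameter sigma are hypothetical and are not taken from the paper.

    import numpy as np

    def pinball_loss(r, tau=0.5):
        # Standard quantile (pinball) loss on residuals r = y - f(x):
        # tau * r for r >= 0 and (tau - 1) * r otherwise
        # (asymmetric, convex, unbounded).
        return np.where(r >= 0, tau * r, (tau - 1.0) * r)

    def lq_like_loss(r, tau=0.5, sigma=1.0):
        # Hypothetical correntropy-style rescaling of the pinball loss.
        # The exponential map bounds the loss in [0, 1), keeps the asymmetry
        # controlled by tau, and makes the loss non-convex -- the three
        # properties the abstract attributes to the proposed L_q-loss.
        return 1.0 - np.exp(-pinball_loss(r, tau) / sigma)

Because a loss of this kind saturates for large residuals, any single outlier contributes at most a bounded penalty, which is the intuition behind the robustness claims; the non-convexity it introduces is also what motivates solving the resulting SVM problems with the concave-convex procedure (CCCP).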
