Article

Training robust support vector regression with smooth non-convex loss function

Journal

OPTIMIZATION METHODS & SOFTWARE
Volume 27, Issue 6, Pages 1039-1058

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/10556788.2011.557725

Keywords

support vector machine; regression; loss function; robustness; d.c. optimization; Newton method

Funding

  1. National Natural Science Foundation of China [70601033]


Classical support vector machines are built on convex loss functions. Recently, support vector machines with non-convex loss functions have attracted much attention for their superiority to the classical ones in generalization accuracy and robustness. In this paper, we propose a non-convex loss function to construct a robust support vector regression (SVR). The introduced non-convex loss function includes several truncated loss functions as special cases. The resulting optimization problem is a difference of convex functions (d.c.) program. We employ the concave-convex procedure and develop a Newton-type algorithm to solve it, which both retains the sparseness of SVR and suppresses the influence of outliers in the training samples. Experiments on both synthetic and real-world benchmark data sets confirm the robustness and effectiveness of the proposed method.
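The d.c. structure described in the abstract can be illustrated with a minimal sketch. For simplicity this sketch substitutes a truncated squared loss min(r², s) for the paper's smooth non-convex loss, uses a linear model with ridge regularization rather than the full SVR, and solves each convex subproblem in closed form instead of with the paper's Newton-type algorithm; the function name `cccp_truncated_ridge` and all parameter values are illustrative assumptions, not the authors' method.

```python
import numpy as np

def cccp_truncated_ridge(X, y, s=1.0, lam=1e-2, iters=50):
    """Concave-convex procedure (CCCP) for regression with the
    truncated squared loss min(r^2, s), written as a difference of
    convex functions:  r^2 - max(r^2 - s, 0).

    Each CCCP step linearizes the concave part -max(r^2 - s, 0) at
    the current iterate and solves the resulting convex subproblem,
    whose normal equations are
        (X^T X + lam*I) w_new = X_in^T y_in + X_out^T X_out w,
    where "out" are the points sitting on the flat part of the loss.
    Illustrative sketch only, not the paper's smooth loss or solver.
    """
    n, d = X.shape
    A = X.T @ X + lam * np.eye(d)
    w = np.linalg.solve(A, X.T @ y)      # plain ridge as the starting point
    for _ in range(iters):
        r = y - X @ w
        out = r**2 > s                   # residuals past the truncation level
        XO = X[out]
        b = X[~out].T @ y[~out] + XO.T @ (XO @ w)
        w_new = np.linalg.solve(A, b)
        if np.allclose(w_new, w):
            break
        w = w_new
    return w
```

On data contaminated with gross outliers, the flagged points stop pulling on the fit after a few CCCP iterations, which is the "suppressing outliers" effect the abstract refers to, while plain ridge regression remains biased by them.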
