Article

Adaptive L0 Regularization for Sparse Support Vector Regression

Journal

MATHEMATICS
Volume 11, Issue 13

Publisher

MDPI
DOI: 10.3390/math11132808

Keywords

variable selection; regularization; sparsity; support vector regression


Abstract

In this work, we propose a sparse version of the Support Vector Regression (SVR) algorithm that uses regularization to achieve sparsity in function estimation. To this end, we introduce an adaptive L0 penalty with a ridge structure, which adds no computational complexity to the algorithm. We also adapt an alternative approach based on a similar proposal in the Support Vector Machine (SVM) literature. Numerical studies demonstrate the effectiveness of both proposals. To our knowledge, this is the first sparse version of Support Vector Regression formulated in terms of variable selection rather than support vector selection.
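The abstract does not spell out the penalty, but an adaptive L0 penalty with a ridge structure is commonly realized as an iteratively reweighted ridge term: each coefficient's ridge weight is set from the previous estimate, so small coefficients are driven toward exactly zero while large ones are barely shrunk. A minimal sketch of that idea combined with the epsilon-insensitive SVR loss might look as follows; the function name, defaults, and the plain subgradient solver are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sparse_svr_adaptive_l0(X, y, lam=0.1, eps_tube=0.1, delta=0.1,
                           n_outer=5, n_inner=200, lr=0.05):
    """Illustrative sketch: linear SVR with an adaptive L0-style penalty.

    The L0 "norm" is approximated by a reweighted ridge term
    sum_j beta_j^2 / (beta_prev_j^2 + delta): coefficients that were
    small in the previous outer iteration get a large ridge weight and
    are shrunk toward zero, while large coefficients are barely
    penalized. Each outer step is thus an ordinary ridge-penalized SVR,
    solved here by full-batch subgradient descent on the
    epsilon-insensitive loss. All defaults are illustrative.
    """
    n, p = X.shape
    beta = np.zeros(p)
    b = 0.0
    for _ in range(n_outer):
        # Adaptive ridge weights from the current estimate: the ridge
        # structure means the per-iteration problem stays quadratic.
        v = 1.0 / (beta ** 2 + delta)
        for _ in range(n_inner):
            r = X @ beta + b - y
            # Subgradient of the epsilon-insensitive loss:
            # zero inside the tube, +/-1 outside it.
            g = np.where(r > eps_tube, 1.0,
                         np.where(r < -eps_tube, -1.0, 0.0))
            beta -= lr * (X.T @ g / n + lam * v * beta)
            b -= lr * g.mean()
    beta[np.abs(beta) < 0.05] = 0.0  # hard-threshold negligible coefficients
    return beta, b
```

On data generated as y = 2·x0 with four irrelevant features, this sketch recovers a large coefficient on the relevant variable and shrinks the rest toward zero, which is the variable-selection behavior the paper targets.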

