Article

Adaptive L0 Regularization for Sparse Support Vector Regression

Journal

MATHEMATICS
Volume 11, Issue 13, Pages: -

Publisher

MDPI
DOI: 10.3390/math11132808

Keywords

variable selection; regularization; sparsity; support vector regression

In this study, we propose a sparse version of Support Vector Regression (SVR) that uses regularization to achieve sparsity in function estimation. To this end, we introduce an adaptive L0 penalty with a ridge structure, which adds no extra computational complexity to the algorithm. We also consider an alternative approach based on a similar proposal in the Support Vector Machine (SVM) literature. Numerical studies demonstrate the effectiveness of both proposals. To the best of our knowledge, this is the first work to address a sparse version of SVR in terms of variable selection rather than support vector selection.
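To illustrate how an L0-type penalty can be given a "ridge structure", one common device is an iteratively reweighted ridge scheme in which per-feature weights w_j = 1 / (beta_j^2 + delta^2) make the quadratic penalty mimic the L0 norm as the iterations converge. The sketch below shows that general idea only; the function name, hyperparameters, and the use of scikit-learn's LinearSVR are our assumptions for illustration and do not reproduce the authors' exact algorithm.

    import numpy as np
    from sklearn.svm import LinearSVR

    def adaptive_ridge_svr(X, y, lam=1.0, eps=0.1, delta=1e-4, n_iter=20):
        """Iteratively reweighted ridge approximation of an L0-penalized linear SVR.

        Each round fits a standard L2-penalized epsilon-insensitive SVR on
        rescaled features; the weights w_j = 1 / (beta_j^2 + delta^2) make the
        ridge term behave like an L0 penalty as the iterations stabilize.
        Names and defaults are illustrative, not taken from the paper.
        """
        n, p = X.shape
        beta = np.ones(p)                       # warm start: all features active
        for _ in range(n_iter):
            w = 1.0 / (beta ** 2 + delta ** 2)  # adaptive ridge weights
            scale = 1.0 / np.sqrt(w)
            # A weighted ridge penalty sum_j w_j * beta_j^2 is equivalent to a
            # plain ridge penalty after rescaling column j by 1 / sqrt(w_j).
            svr = LinearSVR(epsilon=eps, C=1.0 / lam,
                            loss="epsilon_insensitive", max_iter=10000)
            svr.fit(X * scale, y)
            beta = svr.coef_ * scale            # map back to the original scale
        beta[np.abs(beta) < 1e-6] = 0.0         # hard-threshold numerically dead features
        return beta, svr.intercept_

Features whose coefficients shrink toward zero receive ever larger weights and are effectively removed in subsequent rounds, which is how variable selection, rather than support-vector selection, emerges from the ridge-structured penalty.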

