Journal
MATHEMATICS
Volume 11, Issue 13
Publisher
MDPI
DOI: 10.3390/math11132808
Keywords
variable selection; regularization; sparsity; support vector regression
Abstract
In this work, we propose a sparse version of the Support Vector Regression (SVR) algorithm that uses regularization to achieve sparsity in function estimation. To this end, we employ an adaptive L-0 penalty that has a ridge structure and therefore introduces no additional computational complexity to the algorithm. We also adopt an alternative approach based on a similar proposal in the Support Vector Machine (SVM) literature. Numerical studies demonstrate the effectiveness of our proposals. To our knowledge, this is the first sparse version of Support Vector Regression formulated in terms of variable selection rather than support vector selection.
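The abstract's key computational claim is that an adaptive L-0 penalty with a ridge structure adds no extra complexity. A common way to realize this (not the authors' exact algorithm, but a minimal sketch under that assumption) is iteratively reweighted ridge regression: the surrogate penalty λ Σ_j β_j² / (β_j² + ε) approaches λ‖β‖₀ as ε → 0, yet each iteration only solves a ridge-type linear system. Squared loss stands in here for the ε-insensitive SVR loss; the function and variable names are illustrative.

```python
import numpy as np

def adaptive_ridge_l0(X, y, lam=1.0, eps=1e-6, n_iter=50):
    """Approximate L0-penalized regression via iteratively reweighted ridge.

    The surrogate penalty lam * sum_j beta_j**2 / (beta_j**2 + eps)
    tends to lam * ||beta||_0 as eps -> 0; each step solves a weighted
    ridge system, so the per-iteration cost matches plain ridge.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS warm start
    for _ in range(n_iter):
        w = lam / (beta ** 2 + eps)                  # adaptive ridge weights
        beta = np.linalg.solve(X.T @ X + np.diag(w), X.T @ y)
    return beta

# Illustrative data: only 2 of 10 variables are truly relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.zeros(10)
beta_true[0], beta_true[3] = 2.0, -3.0
y = X @ beta_true + 0.1 * rng.standard_normal(200)

beta_hat = adaptive_ridge_l0(X, y)
```

Coefficients of the irrelevant variables are driven to (numerically) zero while the relevant ones are retained, which mirrors the variable-selection behavior the paper targets for SVR.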