Article

SCALABLE ALGORITHMS FOR THE SPARSE RIDGE REGRESSION

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 30, Issue 4, Pages 3359-3386

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/19M1245414

Keywords

approximation algorithm; chance constraint; conic program; mixed integer; ridge regression

Sparse regression and variable selection for large-scale data have developed rapidly over the past decades. This work focuses on sparse ridge regression, which enforces sparsity through an L0-norm constraint. We first prove that the continuous relaxation of the mixed integer second order conic (MISOC) reformulation based on the perspective formulation is equivalent to that of the convex integer formulation proposed in recent work. We also show that the convex hull of the constraint system of the MISOC formulation coincides with its continuous relaxation. Building on these two formulations (i.e., the MISOC formulation and the convex integer formulation), we analyze two scalable algorithms, a greedy algorithm and a randomized algorithm, for sparse ridge regression with desirable theoretical properties. The proposed algorithms are proved to yield near-optimal solutions under mild conditions. We further propose integrating the greedy algorithm with the randomized algorithm, which greedily searches for features within the nonzero support identified by the continuous relaxation of the MISOC formulation. The merits of the proposed methods are illustrated through numerical examples in comparison with several existing approaches.
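To make the greedy procedure mentioned in the abstract concrete, the sketch below is a minimal forward-selection heuristic for sparse ridge regression, assuming the standard formulation min_beta ||y - X beta||^2 + lam * ||beta||^2 subject to ||beta||_0 <= k. The NumPy implementation, function names, and refitting step are illustrative assumptions and are not the authors' code or selection rule.

import numpy as np

def ridge_objective(X_S, y, lam):
    # Ridge subproblem restricted to the columns in X_S:
    # returns min_b ||y - X_S b||^2 + lam * ||b||^2 and the minimizer b.
    p = X_S.shape[1]
    b = np.linalg.solve(X_S.T @ X_S + lam * np.eye(p), X_S.T @ y)
    resid = y - X_S @ b
    return resid @ resid + lam * (b @ b), b

def greedy_sparse_ridge(X, y, k, lam):
    # Forward-selection heuristic: grow the support one feature at a time,
    # each time adding the column that gives the largest decrease in the
    # ridge objective restricted to the current support.
    p = X.shape[1]
    support = []
    for _ in range(min(k, p)):
        best_j, best_val = None, np.inf
        for j in range(p):
            if j in support:
                continue
            val, _ = ridge_objective(X[:, support + [j]], y, lam)
            if val < best_val:
                best_j, best_val = j, val
        support.append(best_j)
    # Refit the ridge coefficients on the final support and embed them
    # into a length-p coefficient vector.
    _, b = ridge_objective(X[:, support], y, lam)
    beta = np.zeros(p)
    beta[support] = b
    return support, beta

A call such as greedy_sparse_ridge(X, y, k=10, lam=1.0) would return the selected support together with the refitted ridge coefficients. In the integrated scheme described in the abstract, the candidate set scanned in the inner loop would be restricted to the nonzero support identified by the continuous relaxation of the MISOC formulation rather than all p features.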
