Article

Sparse Convex Regression

Journal

INFORMS JOURNAL ON COMPUTING
Volume 33, Issue 1, Pages 262-279

Publisher

INFORMS
DOI: 10.1287/ijoc.2020.0954

Keywords

statistics; regression; linear programming; large-scale systems; applications

Abstract

The study addresses the problem of best k-subset convex regression using n observations in d variables, presenting scalable algorithms for both sparse and non-sparse cases. The algorithms show promising results when compared to state-of-the-art methods, providing high-quality solutions in practical times. Additionally, the methods are effective in controlling the false discovery rate.
We consider the problem of best k-subset convex regression using n observations in d variables. For the case without sparsity, we develop a scalable algorithm for obtaining high-quality solutions in practical times that compare favorably with other state-of-the-art methods. We show that, by using a cutting-plane method, the least squares convex regression problem can be solved for sizes (n, d) = (10^4, 10) in minutes and (n, d) = (10^5, 10^2) in hours. Our algorithm can be adapted to solve variants such as finding the best convex or concave functions with coordinate-wise monotonicity or norm-bounded subgradients, and minimizing the ℓ1 loss, all with similar scalability to the least squares convex regression problem. Under sparsity, we propose algorithms that iteratively solve for the best subset of features based on first-order and cutting-plane methods. We show that our methods scale for sizes (n, d, k) = (10^4, 10^2, 10) in minutes and (n, d, k) = (10^5, 10^2, 10) in hours. We demonstrate that these methods control the false discovery rate effectively.
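As a toy illustration of the optimization problem being solved (not the authors' cutting-plane algorithm, which is designed for far larger instances), least squares convex regression can be posed as a quadratic program. In one dimension with sorted, equally spaced inputs, the convexity constraint reduces to nonnegative second differences of the fitted values; the sketch below, with made-up data and an off-the-shelf SciPy solver, is purely hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch: 1-D least squares convex regression.
# For sorted, equally spaced x, fitted values theta are consistent with a
# convex function iff every second difference is nonnegative:
#   theta[i+1] - 2*theta[i] + theta[i-1] >= 0.
# We solve the resulting quadratic program with SciPy's SLSQP solver.

rng = np.random.default_rng(0)
n = 30
x = np.linspace(-2.0, 2.0, n)
y = x**2 + 0.3 * rng.standard_normal(n)  # noisy samples of a convex function

def sse(theta):
    # Least squares objective: sum of squared residuals.
    return np.sum((theta - y) ** 2)

# One inequality constraint per interior point (nonnegative second difference).
constraints = [
    {"type": "ineq", "fun": (lambda t, i=i: t[i + 1] - 2 * t[i] + t[i - 1])}
    for i in range(1, n - 1)
]

res = minimize(sse, x0=y.copy(), constraints=constraints, method="SLSQP")
theta = res.x

# Convexity of the fit holds up to solver tolerance.
second_diffs = theta[2:] - 2 * theta[1:-1] + theta[:-2]
print(res.success, second_diffs.min())
```

The paper's contribution is precisely that this dense-constraint formulation does not scale (the d-dimensional version has O(n^2) subgradient-hyperplane constraints), motivating a cutting-plane scheme that adds violated constraints lazily.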

