Article

COORDINATE DESCENT ALGORITHMS FOR LASSO PENALIZED REGRESSION

Journal

ANNALS OF APPLIED STATISTICS
Volume 2, Issue 1, Pages 224-244

Publisher

INST MATHEMATICAL STATISTICS
DOI: 10.1214/07-AOAS147

Keywords

Model selection; Edgeworth's algorithm; cyclic; greedy; consistency; convergence

Funding

  1. NIH [GM53275, MH59490]


Imposition of a lasso penalty shrinks parameter estimates toward zero and performs continuous model selection. Lasso penalized regression is capable of handling linear regression problems where the number of predictors far exceeds the number of cases. This paper tests two exceptionally fast algorithms for estimating regression coefficients with a lasso penalty. The previously known l(2) algorithm is based on cyclic coordinate descent. Our new l(1) algorithm is based on greedy coordinate descent and Edgeworth's algorithm for ordinary l(1) regression. Each algorithm relies on a tuning constant that can be chosen by cross-validation. In some regression problems it is natural to group parameters and penalize parameters group by group rather than separately. If the group penalty is proportional to the Euclidean norm of the parameters of the group, then it is possible to majorize the norm and reduce parameter estimation to l(2) regression with a lasso penalty. Thus, the existing algorithm can be extended to novel settings. Each of the algorithms discussed is tested on simulated data, real data, or both. The Appendix proves that a greedy form of the l(2) algorithm converges to the minimum value of the objective function.
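The cyclic coordinate descent idea the abstract refers to can be illustrated with a minimal sketch: for the lasso objective 0.5*||y - Xb||^2 + lam*||b||_1, each one-dimensional update has a closed form given by soft-thresholding. This is a generic textbook sketch, not the paper's implementation; the function names and the residual-updating scheme here are the author's own illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: closed-form minimizer of
    # 0.5*(b - z)**2 + t*|b| in the scalar b.
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cyclic_cd(X, y, lam, n_iter=200):
    # Minimize 0.5*||y - X b||^2 + lam*||b||_1 by cyclically
    # updating one coordinate at a time with the others held fixed.
    n, p = X.shape
    b = np.zeros(p)
    r = y - X @ b                    # current residual
    col_sq = (X ** 2).sum(axis=0)    # per-column squared norms
    for _ in range(n_iter):
        for j in range(p):
            # Inner product with the partial residual that excludes
            # the current contribution of coordinate j.
            z = X[:, j] @ r + col_sq[j] * b[j]
            b_new = soft_threshold(z, lam) / col_sq[j]
            r += X[:, j] * (b[j] - b_new)   # incremental residual update
            b[j] = b_new
    return b
```

With lam = 0 the iterates converge to the ordinary least squares solution; increasing lam shrinks coefficients toward zero and sets some exactly to zero, which is the continuous model selection described above.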
