Article

PATHWISE COORDINATE OPTIMIZATION

Journal

ANNALS OF APPLIED STATISTICS
Volume 1, Issue 2, Pages 302-332

Publisher

INST MATHEMATICAL STATISTICS
DOI: 10.1214/07-AOAS131

Keywords

Coordinate descent; lasso; convex optimization

Funding

  1. NSF [DMS-97-64411, DMS-05-50670, DMS-99-71405]
  2. NIH [2R01 CA 72028-07, N01-HV-28183]
  3. Albion Walter Hewlett Stanford Graduate Fellowship

We consider one-at-a-time coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with the well-known LARS (or homotopy) procedure in large lasso problems, and that it can be applied to related methods such as the garotte and elastic net. It turns out that coordinate-wise descent does not work in the fused lasso, however, so we derive a generalized algorithm that yields the solution in much less time than a standard convex optimizer. Finally, we generalize the procedure to the two-dimensional fused lasso, and demonstrate its performance on some image smoothing problems.
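The coordinate-wise descent idea summarized in the abstract can be sketched as follows: for the lasso, minimizing over one coefficient at a time with the others held fixed has a closed-form solution given by soft-thresholding. This is a minimal illustrative sketch, not the paper's implementation; the function names and the objective scaling (1/2n) are assumptions for the example.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso objective
    (1/2n) * ||y - X beta||^2 + lam * ||beta||_1,
    updating one coefficient at a time with the rest held fixed."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed from the fit.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            # Exact minimizer in coordinate j is a soft-thresholded
            # univariate least-squares coefficient.
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta
```

With lam = 0 the update reduces to cyclic least-squares fitting and converges to the ordinary least-squares solution; increasing lam shrinks coefficients and sets some exactly to zero, which is what makes the one-at-a-time scheme attractive for large sparse problems.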

