Journal
ANNALS OF APPLIED STATISTICS
Volume 1, Issue 2, Pages 302-332
Publisher
INST MATHEMATICAL STATISTICS
DOI: 10.1214/07-AOAS131
Keywords
Coordinate descent; lasso; convex optimization
Funding
- NSF [DMS-97-64411, DMS-05-50670, DMS-99-71405]
- NIH [2R01 CA 72028-07, N01-HV-28183]
- Albion Walter Hewlett Stanford Graduate Fellowship
Abstract
We consider one-at-a-time coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with the well-known LARS (or homotopy) procedure in large lasso problems, and that it can be applied to related methods such as the garotte and elastic net. It turns out that coordinate-wise descent does not work in the fused lasso, however, so we derive a generalized algorithm that yields the solution in much less time than a standard convex optimizer. Finally, we generalize the procedure to the two-dimensional fused lasso, and demonstrate its performance on some image smoothing problems.
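For readers who want a concrete picture of the coordinate-wise update the abstract refers to, the following is a minimal Python sketch of cyclic coordinate descent for L1-penalized regression. It illustrates the general technique only, not the paper's implementation; the function names, the 1/(2n) loss scaling, and the fixed sweep count are assumptions made for this sketch.

import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_sweeps=100):
    # Sketch: minimize (1/(2n)) * ||y - X b||^2 + lam * ||b||_1 by
    # cycling through coordinates; each update soft-thresholds the
    # univariate least-squares solution on the partial residual,
    # holding all other coordinates fixed.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residual with coordinate j's contribution removed
            r_j = y - X @ beta + X[:, j] * beta[j]
            z_j = X[:, j] @ r_j / n
            beta[j] = soft_threshold(z_j, lam) / (X[:, j] @ X[:, j] / n)
    return beta

# Example: recover a sparse signal from noisy observations
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.standard_normal(200)
print(lasso_coordinate_descent(X, y, lam=0.1)[:5])

Each coordinate update has a closed form, which is why a full sweep over the coordinates is cheap and the method can compete with LARS on large problems, as the abstract notes.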
Authors
Friedman, Jerome; Hastie, Trevor; Höfling, Holger; Tibshirani, Robert