Article

A coordinate gradient descent method for nonsmooth separable minimization

Journal

MATHEMATICAL PROGRAMMING
Volume 117, Issue 1-2, Pages 387-423

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-007-0170-0

Keywords

error bound; global convergence; linear convergence rate; nonsmooth optimization; coordinate descent

Abstract

We consider the problem of minimizing the sum of a smooth function and a separable convex function. This problem includes as special cases bound-constrained optimization and smooth optimization with ℓ1-regularization. We propose a (block) coordinate gradient descent method for solving this class of nonsmooth separable problems. We establish global convergence and, under a local Lipschitzian error bound assumption, linear convergence for this method. The local Lipschitzian error bound holds under assumptions analogous to those for constrained smooth optimization, e.g., the convex function is polyhedral and the smooth function is (nonconvex) quadratic or is the composition of a strongly convex function with a linear mapping. We report numerical experience with solving the ℓ1-regularization of unconstrained optimization problems from Moré et al. (ACM Trans. Math. Softw. 7, 17-41, 1981) and from the CUTEr set (Gould et al. in ACM Trans. Math. Softw. 29, 373-394, 2003). Comparison with L-BFGS-B and MINOS, applied to a reformulation of the ℓ1-regularized problem as a bound-constrained optimization problem, is also reported.
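To illustrate the method on its simplest special case, here is a minimal Python sketch of coordinate gradient descent for the ℓ1-regularized problem min_x f(x) + c*||x||_1 with f smooth. This is an illustration under simplifying assumptions, not the authors' implementation: it updates one coordinate at a time rather than blocks, replaces the paper's Hessian approximation with a fixed scalar h > 0, and uses a simplified Armijo line search; the names cgd_l1 and soft_threshold and all defaults are our own. The point it demonstrates is that, with a diagonal quadratic model, each coordinate's direction subproblem min_d g*d + 0.5*h*d^2 + c*|x_j + d| has a closed-form soft-thresholding solution, so the per-coordinate step is cheap.

```python
import numpy as np

def soft_threshold(z, t):
    """Closed-form minimizer of 0.5*(u - z)**2 + t*|u| over u."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_l1(f, grad_f, x0, c, h=1.0, sigma=0.1, beta=0.5,
           max_iter=500, tol=1e-8):
    """Cyclic coordinate gradient descent sketch for min f(x) + c*||x||_1.

    h     : coefficient of the diagonal quadratic model (a stand-in for a
            Hessian approximation; any upper bound on the per-coordinate
            curvature of f works)
    sigma : Armijo sufficient-decrease parameter
    beta  : backtracking factor
    """
    x = x0.astype(float)
    F = lambda v: f(v) + c * np.abs(v).sum()
    for _ in range(max_iter):
        max_move = 0.0
        for j in range(x.size):
            # Full gradient for simplicity; a serious code updates it incrementally.
            g = grad_f(x)[j]
            # Direction subproblem: min_d g*d + 0.5*h*d**2 + c*|x[j] + d|
            d = soft_threshold(x[j] - g / h, c / h) - x[j]
            if d == 0.0:
                continue
            # Model decrease; strictly negative whenever d != 0.
            delta = g * d + c * (abs(x[j] + d) - abs(x[j]))
            alpha, Fx = 1.0, F(x)
            while True:  # backtracking (Armijo-type) line search on F
                x_trial = x.copy()
                x_trial[j] += alpha * d
                if F(x_trial) <= Fx + sigma * alpha * delta or alpha < 1e-12:
                    break
                alpha *= beta
            x = x_trial
            max_move = max(max_move, abs(alpha * d))
        if max_move < tol:  # no coordinate moved appreciably
            break
    return x

# Usage: ℓ1-regularized least squares, f(x) = 0.5*||Ax - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
f = lambda v: 0.5 * np.sum((A @ v - b) ** 2)
grad_f = lambda v: A.T @ (A @ v - b)
x_hat = cgd_l1(f, grad_f, np.zeros(20), c=0.5,
               h=np.linalg.norm(A, 2) ** 2)  # squared spectral norm bounds the curvature
```

In the paper's general setting, the scalar h is replaced by a positive definite (often diagonal) approximation of the Hessian restricted to the chosen coordinate block, and blocks can be chosen, e.g., cyclically or by a Gauss-Southwell-type rule; the cyclic single-coordinate sweep above is the simplest instance of that scheme.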

Authors

Tseng, Paul; Yun, Sangwoon
