Article

Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization

Journal

Journal of Optimization Theory and Applications

Publisher

Springer/Plenum Publishers
DOI: 10.1007/s10957-008-9458-3

Keywords

Nonsmooth optimization; Linear constraints; Support vector machines; Bilevel optimization; ℓ1-regularization; Coordinate gradient descent; Global convergence; Linear convergence rate; Complexity bound

Funding

National Science Foundation [DMS-0511283]

Abstract

We consider the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables, subject to m linear equality constraints. We propose a block-coordinate gradient descent method for solving this problem, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent. We establish global convergence of the method to first-order stationarity and, under a local error bound assumption, a linear rate of convergence. If f is convex with Lipschitz continuous gradient, the method terminates in O(n²/ε) iterations with an ε-optimal solution. If P is separable, the Gauss-Southwell-q rule is implementable in O(n) operations when m = 1 and in O(n²) operations when m > 1. In the special case of support vector machine training, for which f is convex quadratic, P is separable, and m = 1, this complexity bound is comparable to the best known bound for decomposition methods. If f is convex, then, by gradually reducing the weight on P to zero, the method can be adapted to solve the bilevel problem of minimizing P over the set of minima of f + δ_X, where δ_X is the indicator function of X, the closure of the feasible set. This has application in finding the least 1-norm solution of a maximum-likelihood estimation problem.
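To make the single-constraint case concrete, below is a minimal Python sketch of one feasibility-preserving block step, under the simplifying assumptions m = 1 with constraint vector a = (1, ..., 1) and P ≡ 0, using a scalar Hessian surrogate H in the quadratic model. The greedy maximal-gradient-difference pair selection is an illustrative stand-in for the paper's Gauss-Southwell-q rule, and all names (bcgd_pair_step, H, step) are hypothetical, not the authors' implementation.

    import numpy as np

    def bcgd_pair_step(x, grad, H=1.0, step=1.0):
        """One block step on min f(x) subject to sum(x) = const (m = 1).

        grad : gradient of the smooth term f at x.
        H    : scalar Hessian surrogate in the quadratic model
               q(d) = grad.d + (H/2)*||d||^2 restricted to the pair.
        """
        # For a direction d_i = t, d_j = -t the model reduces to
        # (g_i - g_j)*t + H*t^2, so the pair with the largest
        # predicted descent maximizes |g_i - g_j|.
        i = int(np.argmax(grad))
        j = int(np.argmin(grad))
        t = -(grad[i] - grad[j]) / (2.0 * H)  # minimizer of the pair model
        d = np.zeros_like(x)
        d[i], d[j] = t, -t                    # keeps sum(x + d) = sum(x)
        return x + step * d

    # Toy usage: minimize (1/2)*||x - c||^2 subject to sum(x) = 0.
    c = np.array([1.0, -2.0, 0.5, 3.0])
    x = np.zeros_like(c)
    for _ in range(200):
        x = bcgd_pair_step(x, grad=x - c)
    print(x, x.sum())  # approaches the projection of c onto {sum(x) = 0}

Each step equalizes the largest and smallest gradient components while preserving feasibility, so the iterates approach a point where all components of the gradient coincide (the multiplier of the equality constraint). For support vector machine training, where f is convex quadratic and m = 1, this pair selection resembles the maximal-violating-pair working-set rules used by the decomposition methods against which the abstract compares its complexity bound.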

Authors

Paul Tseng, Sangwoon Yun
