Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 22, Issue 2, Pages 533-556
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/090780705
Keywords
convex optimization; variable splitting; alternating direction augmented Lagrangian method; alternating linearization method; complexity theory; decomposition; smoothing techniques; parallel computing; proximal point algorithm; optimal gradient method
Funding
- NSF [DMS 06-06712, DMS 10-16571]
- ONR [N00014-08-1-1118]
- DOE [DE-FG02-08ER25856]
- NSF Division of Mathematical Sciences, Directorate for Mathematical &amp; Physical Sciences [1016571]
We present in this paper two different classes of general multiple-splitting algorithms for solving finite-dimensional convex optimization problems. Under the assumption that the function being minimized can be written as the sum of K convex functions, each of which has a Lipschitz continuous gradient, we prove that the number of iterations needed by the first class of algorithms to obtain an ε-optimal solution is O((K − 1)L/ε), where L is an upper bound on all of the Lipschitz constants. The algorithms in the second class are accelerated versions of those in the first class, where the complexity result is improved to O(√((K − 1)L/ε)) while the computational effort required at each iteration is almost unchanged. To the best of our knowledge, the complexity results presented in this paper are the first ones of this type that have been given for splitting and alternating direction-type methods. Moreover, all algorithms proposed in this paper are parallelizable, which makes them particularly attractive for solving certain large-scale problems.
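To illustrate the parallelizable structure the abstract describes, the following is a minimal toy sketch of a splitting-style gradient iteration for minimizing a sum of K smooth convex terms. It is NOT the paper's multiple-splitting or alternating linearization algorithm; the function name, scalar setting, quadratic test functions, and step size 1/(KL) are all illustrative assumptions.

```python
# Minimal sketch: minimize f(x) = sum_{i=1}^K f_i(x), where each f_i is
# convex with an L-Lipschitz gradient. The K gradient evaluations in each
# iteration are independent, so they could be computed in parallel.
# This toy uses scalar x and quadratic f_i; it is NOT the paper's method.

def multiple_splitting_sketch(grads, L, x0, iters=200):
    """grads: list of K gradient oracles grad f_i; L: common Lipschitz bound."""
    K = len(grads)
    x = x0
    step = 1.0 / (K * L)  # conservative step size for a sum of K L-smooth terms
    for _ in range(iters):
        # Each gi(x) is an independent oracle call -> parallelizable.
        g = sum(gi(x) for gi in grads)
        x = x - step * g
    return x

# Example: f_i(x) = 0.5 * (x - c_i)^2, so grad f_i(x) = x - c_i and L = 1.
# The minimizer of the sum is the mean of the c_i.
cs = [1.0, 2.0, 6.0]
grads = [lambda x, c=c: x - c for c in cs]
x_star = multiple_splitting_sketch(grads, L=1.0, x0=0.0)
print(round(x_star, 4))  # converges to the mean of the c_i, here 3.0
```

The accelerated class mentioned in the abstract would add a Nesterov-style extrapolation step between iterations, which is what improves the iteration bound from O((K − 1)L/ε) to O(√((K − 1)L/ε)).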