Article

Efficiency of minimizing compositions of convex functions and smooth maps

Journal

MATHEMATICAL PROGRAMMING
Volume 178, Issue 1-2, Pages 503-558

Publisher

Springer Heidelberg
DOI: 10.1007/s10107-018-1311-3

Keywords

Composite minimization; Fast gradient methods; Gauss-Newton; Prox-gradient; Inexactness; Complexity; Smoothing; Incremental methods; Acceleration

Funding

  1. AFOSR [FA9550-15-1-0237]
  2. NSF [DMS 1651851]

Abstract

We consider the global efficiency of algorithms for minimizing a sum of a convex function and a composition of a Lipschitz convex function with a smooth map. The basic algorithm we rely on is the prox-linear method, which in each iteration solves a regularized subproblem formed by linearizing the smooth map. When the subproblems are solved exactly, the method has efficiency O(ε^{-2}), akin to gradient descent for smooth minimization. We show that when the subproblems can only be solved by first-order methods, a simple combination of smoothing, the prox-linear method, and a fast-gradient scheme yields an algorithm with complexity Õ(ε^{-3}). We round off the paper with an inertial prox-linear method that automatically accelerates in the presence of convexity.
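The prox-linear step described in the abstract can be sketched on a toy scalar instance. The choices below (h = |·| and c(x) = x² − 2) are illustrative assumptions, not the paper's experiments; with these choices the linearized subproblem min_y |c(x) + c'(x)(y − x)| + (1/2t)(y − x)² has a closed-form soft-thresholding solution:

```python
import math

def prox_linear_abs(c, J, x0, t=0.1, iters=50):
    """Toy prox-linear method for min_x |c(x)| with scalar smooth c.

    Each iteration solves the regularized linearized subproblem
        min_y |c(x) + c'(x)(y - x)| + (1/(2t)) (y - x)^2,
    whose solution is obtained by soft-thresholding the linearized value.
    """
    x = x0
    for _ in range(iters):
        cx, Jx = c(x), J(x)
        # soft-threshold c(x) at level t * c'(x)^2 (prox of |.| in the u-variable)
        u = math.copysign(max(abs(cx) - t * Jx**2, 0.0), cx)
        # recover the step from the substitution u = c(x) + c'(x) * (y - x)
        x = x + (u - cx) / Jx
    return x

# Illustrative problem: minimize |x^2 - 2|; a minimizer is sqrt(2)
x_star = prox_linear_abs(lambda x: x * x - 2, lambda x: 2 * x, x0=3.0)
```

Once the iterates are close enough that the threshold zeroes the linearized value, the update reduces to a Gauss-Newton step on c(x) = 0, which is consistent with the abstract's analogy between the exact prox-linear method and gradient descent.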

