Article

Adaptive restart of accelerated gradient methods under local quadratic growth condition

Journal

IMA Journal of Numerical Analysis
Volume 39, Issue 4, Pages 2069–2095

Publisher

Oxford University Press
DOI: 10.1093/imanum/drz007

Keywords

accelerated gradient descent; restarting; quadratic growth condition; unknown error bound

Funding

  1. EPSRC [EP/K02325X/1]
  2. Centre for Numerical Algorithms and Intelligent Software [EP/G036136/1] (Scottish Funding Council)
  3. Orange/Telecom ParisTech think tank Phi-TAB
  4. ANR (LabEx LMH, as part of the Investissement d'avenir project) [ANR-11-LABX-0056-LMH]
  5. Hong Kong Research Grants Council [27303016]
  6. Hong Kong UGC Special Equipment Grant [SEG HKU09]

Abstract

By analyzing accelerated proximal gradient methods under a local quadratic growth condition, we show that restarting these algorithms at any frequency yields a globally linearly convergent algorithm; this was previously known only for sufficiently long restart periods. Since the rate of convergence depends on how well the restart frequency matches the quadratic error bound, we then design a scheme that automatically adapts the restart frequency from the observed decrease of the norm of the gradient mapping. The resulting algorithm has a better theoretical bound than previously proposed methods for adapting to the quadratic error bound of the objective. We illustrate its efficiency on Lasso, regularized logistic regression and total variation denoising problems.
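For illustration, here is a minimal Python/NumPy sketch of the kind of scheme the abstract describes: FISTA applied to a composite objective f + g (with L-Lipschitz gradient of f), restarted every K iterations, with K adjusted from the observed decrease of the gradient-mapping norm. The names (adaptive_restart_fista, grad_map_norm, K0) and the halve-or-double rule are illustrative assumptions for this sketch, not the paper's exact adaptive scheme or its guarantees.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def grad_map_norm(x, grad_f, prox_g, L):
    # Norm of the gradient mapping G(x) = L * (x - prox_g(x - grad_f(x)/L, 1/L)),
    # the quantity the abstract monitors to decide how to restart.
    return L * np.linalg.norm(x - prox_g(x - grad_f(x) / L, 1.0 / L))

def adaptive_restart_fista(grad_f, prox_g, L, x0, K0=10, n_periods=50):
    # FISTA restarted every K iterations. K is doubled whenever the
    # gradient-mapping norm fails to halve over one restart period --
    # a simplified heuristic in the spirit of the paper, not its exact scheme.
    x, K = x0.copy(), K0
    g_prev = grad_map_norm(x, grad_f, prox_g, L)
    for _ in range(n_periods):
        y, t = x.copy(), 1.0
        for _ in range(K):  # one accelerated (FISTA) pass of length K
            x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t = x_new, t_new
        g = grad_map_norm(x, grad_f, prox_g, L)
        if g > 0.5 * g_prev:  # insufficient decrease: period was too short
            K *= 2
        g_prev = g
    return x

# Example: Lasso, min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 500))
b = rng.standard_normal(200)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: soft_threshold(v, lam * t)
x_hat = adaptive_restart_fista(grad_f, prox_g, L, np.zeros(500))
```

The doubling rule mirrors the intuition in the abstract: if the gradient-mapping norm does not contract enough between restarts, the current frequency is mismatched with the (unknown) quadratic error bound, so a longer period is tried next.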

Authors

Olivier Fercoq, Zheng Qu
