Journal
IMA JOURNAL OF NUMERICAL ANALYSIS
Volume 39, Issue 4, Pages 2069-2095
Publisher
OXFORD UNIV PRESS
DOI: 10.1093/imanum/drz007
Keywords
accelerated gradient descent; restarting; quadratic growth condition; unknown error bound
Funding
- EPSRC [EP/K02325X/1]
- Centre for Numerical Algorithms and Intelligent Software [EP/G036136/1]
- Centre for Numerical Algorithms and Intelligent Software (Scottish Funding Council)
- Orange/Telecom ParisTech think tank Phi-TAB
- ANR (LabEx LMH as part of the Investissement d'avenir project) [ANR-11-LABX-0056-LMH]
- Hong Kong Research Grants Council [27303016]
- Hong Kong UGC Special Equipment Grant [SEG HKU09]
Abstract
By analyzing accelerated proximal gradient methods under a local quadratic growth condition, we show that restarting these algorithms at any frequency yields a globally linearly convergent algorithm. This result was previously known only for sufficiently long restart frequencies. Since the rate of convergence depends on the match between the restart frequency and the quadratic error bound, we design a scheme that automatically adapts the restart frequency based on the observed decrease of the norm of the gradient mapping. Our algorithm has a better theoretical bound than previously proposed methods for adapting to the quadratic error bound of the objective. We illustrate the efficiency of the algorithm on Lasso, regularized logistic regression and total variation denoising problems.
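The fixed-frequency restart scheme the abstract refers to can be sketched as follows: run FISTA (an accelerated proximal gradient method) and periodically reset the momentum. This is an illustrative sketch for the Lasso case only, not the paper's adaptive algorithm; the function names, the step-size choice `1/L`, and the restart period are assumptions made for the example.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (the Lasso regularizer).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def restarted_fista(A, b, lam, step, n_iters=500, restart_every=50):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1, with the momentum
    sequence restarted every `restart_every` iterations (fixed frequency)."""
    x = np.zeros(A.shape[1])
    y = x.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for k in range(n_iters):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - step * grad, step * lam)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        if (k + 1) % restart_every == 0:
            # Restart: discard momentum and continue from the current iterate.
            y, t = x.copy(), 1.0
    return x

# Hypothetical usage on a small random Lasso instance.
np.random.seed(0)
A = np.random.randn(40, 20)
b = np.random.randn(40)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2
x_hat = restarted_fista(A, b, lam, step)
```

The paper's adaptive variant would, instead of fixing `restart_every`, adjust it online from the observed decrease of the norm of the gradient mapping; the sketch above only shows the fixed-frequency restart being analyzed.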