Article

Global convergence rate analysis of unconstrained optimization methods based on probabilistic models

Journal

MATHEMATICAL PROGRAMMING
Volume 169, Issue 2, Pages 337-375

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-017-1137-4

Keywords

Line-search methods; Cubic regularization methods; Random models; Global convergence analysis

Funding

  1. Oxford University EPSRC Platform Grant [EP/I01893X/1]
  2. NSF [DMS 10-16571, DMS 13-19356, CCF-1320137]
  3. AFOSR Grant [FA9550-11-1-0239]
  4. DARPA Grant [FA 9550-12-1-0406]

Abstract

We present global convergence rates for a line-search method which is based on random first-order models and directions whose quality is ensured only with certain probability. We show that in terms of the order of the accuracy, the evaluation complexity of such a method is the same as its counterparts that use deterministic accurate models; the use of probabilistic models only increases the complexity by a constant, which depends on the probability of the models being good. We particularize and improve these results in the convex and strongly convex case. We also analyze a probabilistic cubic regularization variant that allows approximate probabilistic second-order models and show improved complexity bounds compared to probabilistic first-order methods; again, as a function of the accuracy, the probabilistic cubic regularization bounds are of the same (optimal) order as for the deterministic case.
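The first-order mechanism the abstract describes can be illustrated with a minimal sketch: take a step along a random gradient model, accept it only if it yields sufficient decrease in the true objective, and grow or shrink the step size accordingly. This is not the authors' algorithm; the function name `probabilistic_line_search`, the parameters `theta` and `gamma`, and the "gradient correct with probability p" noise model are all assumptions made for this example.

```python
import numpy as np

def probabilistic_line_search(f, grad_est, x0, alpha0=1.0, theta=0.1,
                              gamma=2.0, max_iters=500, tol=1e-6):
    """Line-search sketch driven by a probabilistically accurate model.

    grad_est(x) returns a gradient estimate that is only correct with some
    probability. A trial step is accepted when it achieves sufficient
    decrease in the true objective f; the step size expands after a
    successful step and contracts after a failed one.
    """
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iters):
        g = grad_est(x)                      # random first-order model
        if np.linalg.norm(g) < tol:
            break
        trial = x - alpha * g
        # Armijo-type sufficient-decrease test against the true objective
        if f(trial) <= f(x) - theta * alpha * np.dot(g, g):
            x, alpha = trial, gamma * alpha  # success: accept and expand
        else:
            alpha /= gamma                   # failure: shrink the step
    return x

# Usage on a simple quadratic, with a gradient estimate that is exact
# only with probability p and an arbitrary random direction otherwise.
rng = np.random.default_rng(0)
f = lambda x: 0.5 * np.dot(x, x)

def noisy_grad(x, p=0.8):
    return x if rng.random() < p else rng.standard_normal(x.shape)

x_star = probabilistic_line_search(f, noisy_grad, x0=np.ones(5))
```

Bad models tend to fail the decrease test and are filtered out at the cost of a wasted evaluation and a smaller step, which is the intuition behind the abstract's claim that probabilistic models only inflate the complexity by a constant depending on the probability of the model being good.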

