Journal
MATHEMATICAL PROGRAMMING
Volume 112, Issue 1, Pages 159-181
Publisher
SPRINGER
DOI: 10.1007/s10107-006-0089-x
Keywords
convex optimization; unconstrained minimization; Newton's method; cubic regularization; worst-case complexity; global complexity bounds; non-degenerate problems; condition number
Abstract
In this paper we propose an accelerated version of the cubic regularization of Newton's method (Nesterov and Polyak, in Math Program 108(1):177-205, 2006). The original version, used for minimizing a convex function with Lipschitz-continuous Hessian, guarantees a global rate of convergence of order O(1/k^2), where k is the iteration counter. Our modified version converges for the same problem class with order O(1/k^3), keeping the complexity of each iteration unchanged. We study the complexity of both schemes on different classes of convex problems. In particular, we argue that for second-order schemes, the class of non-degenerate problems is different from the standard class.
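The basic (non-accelerated) cubic-regularized Newton step minimizes the second-order Taylor model plus a cubic penalty, s -> <g, s> + (1/2)<H s, s> + (L/6)||s||^3. The sketch below is an illustrative NumPy implementation of that basic scheme only, not of the paper's accelerated variant; the function names and the bisection-based subproblem solver are choices made here for clarity, not taken from the paper.

```python
import numpy as np

def cubic_newton_step(grad, hess, L):
    """Solve min_s <g,s> + 0.5 <H s, s> + (L/6)||s||^3.
    The optimality condition is (H + (L/2)||s|| I) s = -g, so we
    find r = ||s|| as the fixed point of r -> ||s(r)|| by bisection
    (||s(r)|| is decreasing in r, so the crossing is unique)."""
    n = len(grad)
    def s_of(r):
        return np.linalg.solve(hess + 0.5 * L * r * np.eye(n), -grad)
    # Bracket the fixed point: grow hi until ||s(hi)|| <= hi.
    lo, hi = 0.0, 1.0
    while np.linalg.norm(s_of(hi)) > hi:
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(s_of(mid)) > mid:
            lo = mid  # r too small
        else:
            hi = mid  # r too large (or exact)
    return s_of(hi)

def cubic_newton(f_grad, f_hess, x0, L, iters=50):
    """Basic cubic regularization of Newton's method:
    x_{k+1} = x_k + argmin of the cubically regularized model at x_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x + cubic_newton_step(f_grad(x), f_hess(x), L)
    return x
```

For a convex quadratic f(x) = (1/2)(x - a)^T A (x - a) the regularized step is a damped Newton step, and the iterates contract to the minimizer a; any L > 0 is a valid Lipschitz constant for the (constant) Hessian in that case.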
Authors
Yurii Nesterov