Journal
COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
Volume 60, Issue 2, Pages 343-376
Publisher
SPRINGER
DOI: 10.1007/s10589-014-9671-y
Keywords
Smooth unconstrained minimization; Newton's method; Regularization; Global convergence; Local convergence; Computational results
Funding
- CNPq [307714/2011-0, 477611/2013-3, 304032/2010-7, 302962/2011-5, 474996/2013-1]
- FAPESP [2013/05475-7, 2013/07375-0]
- PRONEX Optimization
- FAPERJ [E-26/102.940/2011]
In this work we propose a class of quasi-Newton methods for minimizing a twice differentiable function with Lipschitz continuous Hessian. These methods are based on the quadratic regularization of Newton's method, with explicit algebraic rules for computing the regularization parameter. The convergence properties of this class of methods are analysed. We show that if the sequence generated by the algorithm converges, then its limit point is stationary. We also establish local quadratic convergence in a neighborhood of a stationary point with positive definite Hessian. Encouraging numerical experiments are presented.
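To make the idea concrete, here is a minimal sketch of a quadratically regularized Newton iteration of the kind the abstract describes: each step solves (H(x) + sigma*I) d = -grad(x). The rule used below for updating sigma (multiply by 10 until the shifted Hessian is positive definite, shrink it otherwise) is a simple hypothetical choice for illustration, not the specific algebraic rules analysed in the paper.

```python
import numpy as np

def regularized_newton(grad, hess, x0, sigma0=1.0, tol=1e-8, max_iter=100):
    """Quadratically regularized Newton method (illustrative sketch).

    At each iterate, solves (H + sigma*I) d = -g and sets x <- x + d.
    The sigma update is a hypothetical simple rule, not the paper's.
    """
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # Increase sigma until H + sigma*I is positive definite
        # (i.e., until the Cholesky factorization succeeds).
        while True:
            try:
                L = np.linalg.cholesky(H + sigma * np.eye(n))
                break
            except np.linalg.LinAlgError:
                sigma *= 10.0
        # Solve the regularized Newton system using the Cholesky factor.
        d = np.linalg.solve(L.T, np.linalg.solve(L, -g))
        x = x + d
        # Shrink the regularization so steps approach pure Newton steps,
        # which is what yields local quadratic convergence.
        sigma = max(sigma / 10.0, 1e-12)
    return x

# Usage example: minimize the Rosenbrock function, whose minimizer is (1, 1).
def rosen_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def rosen_hess(x):
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

x_star = regularized_newton(rosen_grad, rosen_hess, np.array([-1.2, 1.0]))
```

Near a stationary point with positive definite Hessian, sigma can be driven to (near) zero, so the iteration reduces to Newton's method and inherits its fast local convergence, which is consistent with the local quadratic convergence result stated in the abstract.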