Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 16, Issue 1, Pages 170-192
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/030601880
Keywords
conjugate gradient method; unconstrained optimization; convergence; line search; Wolfe conditions
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition g_k^T d_k <= -(7/8) ||g_k||^2. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A new line search scheme is developed that is efficient and highly accurate. Efficiency is achieved by exploiting properties of linear interpolants in a neighborhood of a local minimizer. High accuracy is achieved by using a convergence criterion, which we call the approximate Wolfe conditions, obtained by replacing the sufficient decrease criterion in the Wolfe conditions with an approximation that can be evaluated with greater precision in a neighborhood of a local minimum than the usual sufficient decrease criterion. Numerical comparisons are given with both L-BFGS and conjugate gradient methods using the unconstrained optimization problems in the CUTE library.
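As a rough illustration of the direction update and the guaranteed-descent bound stated in the abstract, the sketch below runs a conjugate gradient iteration on a small random quadratic. The beta formula used here is the Hager-Zhang update as it is commonly stated for this method; the abstract itself does not reproduce the formula, so treat the exact expression, the quadratic test problem, and the exact line search as assumptions made for the example, not as the paper's implementation.

```python
import numpy as np

# Minimal sketch: Hager-Zhang-style CG direction update on a quadratic
#   f(x) = 0.5 x^T A x - b^T x,  gradient g(x) = A x - b.
# The beta formula assumed here is
#   beta_k = ( y_k - 2 d_k ||y_k||^2 / (d_k^T y_k) )^T g_{k+1} / (d_k^T y_k),
# and each iteration checks the descent bound from the abstract:
#   g_k^T d_k <= -(7/8) ||g_k||^2.

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)      # symmetric positive definite test matrix
b = rng.standard_normal(6)

x = np.zeros(6)
g = A @ x - b                    # gradient at the starting point
d = -g                           # first direction: steepest descent

for _ in range(20):
    alpha = -(g @ d) / (d @ A @ d)   # exact line search (quadratic case)
    x = x + alpha * d
    g_new = A @ x - b
    if np.linalg.norm(g_new) < 1e-10:
        break                        # converged; avoid dividing by ~0 below
    y = g_new - g                    # gradient change along the step
    dy = d @ y                       # d_k^T y_k (positive here since A is SPD)
    beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
    d = -g_new + beta * d
    g = g_new
    # guaranteed descent condition quoted in the abstract
    assert g @ d <= -0.875 * (g @ g) + 1e-12
```

With exact line search, g_{k+1}^T d_k = 0, so g^T d collapses to -||g||^2 and the 7/8 bound holds trivially; the bound's real content, per the abstract, is that it also holds for any inexact line search.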