Article

A new conjugate gradient method with guaranteed descent and an efficient line search

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 16, Issue 1, Pages 170-192

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/030601880

Keywords

conjugate gradient method; unconstrained optimization; convergence; line search; Wolfe conditions

Abstract

A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition $g_k^T d_k \le -\frac{7}{8}\|g_k\|^2$. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A new line search scheme is developed that is efficient and highly accurate. Efficiency is achieved by exploiting properties of linear interpolants in a neighborhood of a local minimizer. High accuracy is achieved by using a convergence criterion, which we call the approximate Wolfe conditions, obtained by replacing the sufficient decrease criterion in the Wolfe conditions with an approximation that can be evaluated with greater precision in a neighborhood of a local minimum than the usual sufficient decrease criterion. Numerical comparisons are given with both L-BFGS and conjugate gradient methods using the unconstrained optimization problems in the CUTE library.
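The two ingredients the abstract describes, the truncated direction update with guaranteed descent and the derivative-based approximate Wolfe test, can be illustrated compactly. The sketch below is not the authors' CG_DESCENT code; it is a minimal NumPy rendering that assumes the update formula and the default parameter values (eta = 0.01, delta = 0.1, sigma = 0.9) as stated in the paper, and all function names here are illustrative.

```python
# Minimal sketch (not the authors' CG_DESCENT implementation) of the two
# ingredients described in the abstract. Notation: g = grad f(x_k),
# d = search direction, y = g_{k+1} - g_k. Parameter defaults follow the
# values suggested in the paper; function names are chosen for illustration.
import numpy as np

def hz_direction(g_new, g_old, d_old, eta=0.01):
    """Direction update d_{k+1} = -g_{k+1} + beta_k d_k, with beta_k
    truncated from below so that the descent condition
        g^T d <= -(7/8) ||g||^2
    holds for any (inexact) line search, as stated in the abstract."""
    y = g_new - g_old
    dy = d_old @ y                                 # d_k^T y_k, assumed nonzero
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    # Lower truncation used for the global convergence analysis: beta_k is
    # bounded below by a negative threshold that degenerates only if the
    # direction or gradient does.
    eta_k = -1.0 / (np.linalg.norm(d_old) * min(eta, np.linalg.norm(g_old)))
    return -g_new + max(beta, eta_k) * d_old

def approximate_wolfe(dphi0, dphi_alpha, delta=0.1, sigma=0.9):
    """Approximate Wolfe conditions on phi(alpha) = f(x_k + alpha d_k):
        sigma * phi'(0) <= phi'(alpha) <= (2*delta - 1) * phi'(0).
    The derivative-only upper bound stands in for the sufficient decrease
    test phi(alpha) <= phi(0) + delta*alpha*phi'(0), which is hard to
    evaluate accurately in floating point near a local minimizer."""
    return sigma * dphi0 <= dphi_alpha <= (2.0 * delta - 1.0) * dphi0
```

A line search in this spirit would accept a step once approximate_wolfe holds (the paper pairs this with a safeguard on the function value); the interpolation machinery that makes the search efficient is beyond this sketch, and the paper and the CG_DESCENT software remain the authoritative reference.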

Authors

William W. Hager; Hongchao Zhang
