Article

Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent

Journal

ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE
Volume 32, Issue 1, Pages 113-137

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/1132973.1132979

Keywords

algorithms; conjugate gradient method; convergence; line search; unconstrained optimization; Wolfe conditions

Abstract

Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition $g_k^{\mathsf T} d_k \le -\tfrac{7}{8}\,\|g_k\|^2$ and which is globally convergent whenever the line search fulfills the Wolfe conditions. This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.
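The descent bound and the Wolfe conditions mentioned in the abstract can be stated concretely. The sketch below is not the published CG_DESCENT code; it is a minimal Python illustration of the two checks, assuming a user-supplied objective f and gradient grad, with illustrative parameter names (alpha, c1, c2).

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe conditions for a step of length alpha along direction d.

    f    : callable, objective f(x) -> float
    grad : callable, gradient grad(x) -> ndarray
    Returns True when both the sufficient-decrease and curvature conditions hold.
    """
    g0 = grad(x)
    slope0 = g0 @ d                 # directional derivative at x; negative for a descent direction
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * slope0
    curvature = grad(x_new) @ d >= c2 * slope0
    return sufficient_decrease and curvature

def satisfies_descent_bound(g, d, factor=7.0 / 8.0):
    """Check the descent condition quoted in the abstract: g^T d <= -(7/8) ||g||^2."""
    return g @ d <= -factor * (g @ g)

# Small usage example on the convex quadratic f(x) = 1/2 x^T x.
if __name__ == "__main__":
    f = lambda x: 0.5 * x @ x
    grad = lambda x: x
    x = np.array([1.0, -2.0])
    d = -grad(x)                    # steepest-descent direction
    print(satisfies_descent_bound(grad(x), d))      # True: g^T d = -||g||^2 <= -(7/8)||g||^2
    print(satisfies_wolfe(f, grad, x, d, alpha=0.5))
```

For the steepest-descent direction used in the example, the bound holds trivially because $g^{\mathsf T} d = -\|g\|^2$; the point of the paper's scheme is that its conjugate gradient directions also satisfy such a bound.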

Authors

William W. Hager; Hongchao Zhang
