Article

Sufficient Descent Riemannian Conjugate Gradient Methods

Journal

Journal of Optimization Theory and Applications

Publisher

SPRINGER/PLENUM PUBLISHERS
DOI: 10.1007/s10957-021-01874-3

Keywords

Riemannian conjugate gradient method; Sufficient descent condition; Strong Wolfe conditions; Line search algorithm

Funding

  1. JSPS KAKENHI Grant [JP18K11184]

Abstract

This paper considers sufficient descent Riemannian conjugate gradient methods with line search algorithms. We propose two kinds of sufficient descent nonlinear conjugate gradient methods and prove that these methods satisfy the sufficient descent condition on Riemannian manifolds. One is a hybrid method combining a Fletcher-Reeves-type method with a Polak-Ribiere-Polyak-type method, and the other is a Hager-Zhang-type method; both are generalizations of those used in Euclidean space. Moreover, we prove that the hybrid method has a global convergence property under the strong Wolfe conditions and that the Hager-Zhang-type method has the sufficient descent property regardless of whether a line search is used. Further, we review two kinds of line search algorithms on Riemannian manifolds and numerically compare our generalized methods by solving several Riemannian optimization problems. The results show that the performance of the proposed hybrid methods greatly depends on the type of line search used, whereas the Hager-Zhang-type method has the fast convergence property regardless of the type of line search used.
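The two update parameters the abstract refers to have well-known Euclidean forms: the hybrid parameter max(0, min(beta_PRP, beta_FR)) and the Hager-Zhang parameter with its standard lower-bound truncation. The sketch below illustrates them in Euclidean space only (the paper's contribution is their Riemannian generalization, which requires retractions and vector transports not shown here); it uses a simple Armijo backtracking line search with a restart safeguard instead of the strong Wolfe line searches analysed in the paper, so function names and tolerances are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def beta_hybrid(g_new, g_old):
    # Hybrid FR-PRP parameter: max(0, min(beta_PRP, beta_FR)).
    gg = g_old @ g_old
    beta_prp = g_new @ (g_new - g_old) / gg
    beta_fr = (g_new @ g_new) / gg
    return max(0.0, min(beta_prp, beta_fr))

def beta_hz(g_new, g_old, d):
    # Hager-Zhang parameter with the standard lower-bound truncation.
    y = g_new - g_old
    dy = d @ y
    if dy == 0.0:                      # guard against division by zero
        return 0.0
    beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
    eta = -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g_old)))
    return max(beta, eta)

def cg(f, grad, x0, method="hybrid", iters=500, tol=1e-10):
    # Euclidean nonlinear CG with Armijo backtracking (a simplification of
    # the strong Wolfe conditions); restarts with -g if descent is lost.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-20:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = beta_hybrid(g_new, g) if method == "hybrid" else beta_hz(g_new, g, d)
        d = -g_new + beta * d
        if g_new @ d >= 0.0:           # safeguard: restart when d is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

With an Armijo-only line search the sufficient descent condition is not guaranteed for the hybrid method (hence the restart safeguard); one point of the Hager-Zhang parameter, as the abstract notes, is that it yields sufficient descent regardless of the line search.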

