Article

A new, globally convergent Riemannian conjugate gradient method

Journal

OPTIMIZATION
Volume 64, Issue 4, Pages 1011-1031

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/02331934.2013.836650

Keywords

Wolfe conditions; 'scaled' vector transport; global convergence; Riemannian optimization; conjugate gradient method; 49M37; 90C30; 65K05

Funding

  1. Grants-in-Aid for Scientific Research [13J05977] Funding Source: KAKEN


This article deals with the conjugate gradient method on a Riemannian manifold, with a focus on global convergence analysis. Existing conjugate gradient algorithms on a manifold endowed with a vector transport require the assumption that the vector transport does not increase the norm of tangent vectors in order to guarantee that the generated sequences are globally convergent. In this article, the notion of a scaled vector transport is introduced to improve the algorithm so that the generated sequences are globally convergent under a relaxed assumption. In the proposed algorithm, the transported vector is rescaled whenever its norm has increased during the transport. The global convergence is proved theoretically and observed numerically with examples. In fact, the numerical experiments show that there exist minimization problems for which the existing algorithm generates divergent sequences, whereas the proposed algorithm generates convergent ones.
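The rescaling step described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the article's implementation: `transport` stands for an arbitrary user-supplied vector transport, and the inflating/shrinking transports in the usage example are purely hypothetical stand-ins used to exercise both branches.

```python
import numpy as np

def scaled_transport(transport, x, y, xi):
    """Transport the tangent vector xi from point x to point y, then
    rescale the result if its norm increased during the transport,
    in the spirit of the article's 'scaled' vector transport.

    `transport(x, y, xi)` is any vector transport supplied by the user
    (hypothetical signature, for illustration only)."""
    eta = transport(x, y, xi)
    n_xi = np.linalg.norm(xi)
    n_eta = np.linalg.norm(eta)
    if n_eta > n_xi:
        # Norm grew during transport: scale back to the original norm,
        # so the transported vector never has a larger norm than xi.
        eta = (n_xi / n_eta) * eta
    return eta

# Usage with two toy transports (hypothetical, for demonstration):
xi = np.array([3.0, 4.0])                      # ||xi|| = 5
inflate = lambda x, y, v: 2.0 * v              # a transport that doubles norms
shrink = lambda x, y, v: 0.5 * v               # a transport that halves norms

eta1 = scaled_transport(inflate, None, None, xi)  # rescaled back to norm 5
eta2 = scaled_transport(shrink, None, None, xi)   # left untouched, norm 2.5
```

Note that the transported vector is only scaled down, never up: a transport that already satisfies the non-expansiveness assumption passes through unchanged, which is why the relaxed assumption suffices for the convergence analysis.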

Authors


