Article

A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions

Journal

Computational Optimization and Applications
Volume 64, Issue 1, Pages 101-118

Publisher

Springer
DOI: 10.1007/s10589-015-9801-1

Keywords

Riemannian optimization; Conjugate gradient method; Global convergence; Weak Wolfe conditions; Scaled vector transport

Funding

  1. JSPS (Japan Society for the Promotion of Science) KAKENHI [26887037]

Abstract

This article proposes a new Riemannian conjugate gradient method and establishes its global convergence. The existing Fletcher-Reeves-type Riemannian conjugate gradient method is guaranteed to converge globally only when implemented with the strong Wolfe conditions. In contrast, the Dai-Yuan-type Euclidean conjugate gradient method generates globally convergent sequences under the weak Wolfe conditions. This article generalizes Dai and Yuan's Euclidean algorithm to a Riemannian algorithm that requires only the weak Wolfe conditions. The global convergence of the proposed method is proved by means of the scaled vector transport associated with the differentiated retraction. Numerical experiments demonstrate the effectiveness of the proposed algorithm.
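To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a Dai-Yuan-type Riemannian conjugate gradient method on the unit sphere, minimizing the Rayleigh quotient with a bisection line search for the weak Wolfe conditions. Everything here is an illustrative assumption, not the paper's exact construction: the test manifold, the objective, the function names (`retract`, `riem_grad`, `transport`, `wolfe_step`, `dy_riemannian_cg`), and in particular the vector transport, which is replaced by simple tangent-space projection rather than the paper's scaled vector transport associated with the differentiated retraction.

```python
import numpy as np

def retract(x, v):
    """Retraction on the sphere: normalize x + v (an illustrative choice)."""
    y = x + v
    return y / np.linalg.norm(y)

def riem_grad(A, x):
    """Riemannian gradient of f(x) = x^T A x: project 2*A@x onto T_x S."""
    g = 2.0 * A @ x
    return g - (g @ x) * x

def transport(x_new, v):
    """Projection-based vector transport onto T_{x_new} S (a stand-in for
    the paper's scaled differentiated-retraction transport)."""
    return v - (v @ x_new) * x_new

def wolfe_step(A, x, d, c1=1e-4, c2=0.9, t0=1.0, max_iter=50):
    """Bisection search for a step satisfying the weak Wolfe conditions."""
    f0 = x @ A @ x
    g0 = riem_grad(A, x) @ d          # directional derivative at x
    lo, hi, t = 0.0, np.inf, t0
    for _ in range(max_iter):
        x_new = retract(x, t * d)
        if x_new @ A @ x_new > f0 + c1 * t * g0:
            hi = t                     # sufficient-decrease condition fails
        elif riem_grad(A, x_new) @ transport(x_new, d) < c2 * g0:
            lo = t                     # curvature condition fails
        else:
            return t
        t = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return t

def dy_riemannian_cg(A, x0, tol=1e-8, max_iter=500):
    x = x0
    g = riem_grad(A, x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_step(A, x, d)
        x_new = retract(x, t * d)
        g_new = riem_grad(A, x_new)
        d_t = transport(x_new, d)
        # Dai-Yuan-type beta: ||g_{k+1}||^2 / (<g_{k+1}, T(d_k)> - <g_k, d_k>);
        # the weak Wolfe curvature condition keeps the denominator positive.
        beta = (g_new @ g_new) / (g_new @ d_t - g @ d)
        d = -g_new + beta * d_t
        x, g = x_new, g_new
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M + M.T                            # symmetric test matrix
x = dy_riemannian_cg(A, np.ones(20) / np.sqrt(20))
# The minimizer of the Rayleigh quotient on the sphere is an eigenvector of
# the smallest eigenvalue, so these two values are typically close:
print(x @ A @ x, np.linalg.eigvalsh(A)[0])
```

The Dai-Yuan choice of beta is what allows the weaker line-search conditions: unlike the Fletcher-Reeves variant, its convergence analysis does not require the strong Wolfe curvature bound, which is the point the article develops in the Riemannian setting.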

