Proceedings Paper

Decentralized Riemannian Gradient Descent on the Stiefel Manifold

Publisher

JMLR (Journal of Machine Learning Research)

Funding

  1. National Science Foundation [ECCS-1933878]
  2. Air Force Office of Scientific Research [19RT0424]


This paper addresses distributed non-convex optimization on the Stiefel manifold, proposing two decentralized algorithms, DRSGD and DRGTA, with convergence rates of O(1/√K) and O(1/K), respectively. Multi-step consensus is used to keep the iterates in a local consensus region. DRGTA is the first decentralized algorithm achieving exact convergence for distributed optimization on the Stiefel manifold.
We consider distributed non-convex optimization where a network of agents aims to minimize a global function over the Stiefel manifold. The global function is a finite sum of smooth local functions, where each local function is associated with one agent and agents communicate with each other over an undirected connected graph. The problem is non-convex because the local functions are possibly non-convex (but smooth) and the Stiefel manifold is a non-convex set. We present a decentralized Riemannian stochastic gradient method (DRSGD) with a convergence rate of O(1/√K) to a stationary point. To obtain exact convergence with a constant stepsize, we also propose a decentralized Riemannian gradient tracking algorithm (DRGTA) with a convergence rate of O(1/K) to a stationary point. We use multi-step consensus to keep the iterates in a local consensus region. DRGTA is the first decentralized algorithm with exact convergence for distributed optimization on the Stiefel manifold.
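To make the setup concrete, here is a minimal sketch of a DRSGD-style update for a toy decentralized problem. The local costs f_i(X) = -tr(XᵀA_iX), the ring mixing matrix, the stepsize, and the number of consensus rounds are all illustrative assumptions, not the paper's experimental setup; the tangent-space projection and QR retraction are standard Stiefel-manifold tools, and the exact update may differ from the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, agents = 8, 2, 4

# Hypothetical local problems: agent i minimizes f_i(X) = -tr(X^T A_i X)
# over St(n, p); the global cost is the sum over all agents.
A = [(lambda M: M @ M.T)(rng.standard_normal((n, n))) for _ in range(agents)]

def proj_tangent(X, G):
    """Project G onto the tangent space of St(n, p) at X."""
    S = X.T @ G
    return G - X @ ((S + S.T) / 2)

def retract(Y):
    """QR-based retraction back onto St(n, p) (sign-corrected for uniqueness)."""
    Q, R = np.linalg.qr(Y)
    return Q @ np.diag(np.sign(np.sign(np.diag(R)) + 0.5))

# Doubly stochastic mixing matrix for a ring graph (an assumed topology).
W = np.zeros((agents, agents))
for i in range(agents):
    W[i, i] = 0.5
    W[i, (i - 1) % agents] = 0.25
    W[i, (i + 1) % agents] = 0.25

X = [retract(rng.standard_normal((n, p))) for _ in range(agents)]
alpha, t_consensus = 0.01, 2   # stepsize and number of consensus rounds

for k in range(200):
    # Multi-step consensus: average t_consensus times in the ambient space,
    # which keeps the iterates in a local consensus region.
    Z = [x.copy() for x in X]
    for _ in range(t_consensus):
        Z = [sum(W[i, j] * Z[j] for j in range(agents)) for i in range(agents)]
    X_new = []
    for i in range(agents):
        egrad = -2 * A[i] @ X[i]               # Euclidean gradient of f_i
        rgrad = proj_tangent(X[i], egrad)      # Riemannian gradient
        # Combine a consensus step and a gradient step in the tangent space,
        # then retract back onto the manifold.
        step = proj_tangent(X[i], Z[i] - X[i]) - alpha * rgrad
        X_new.append(retract(X[i] + step))
    X = X_new
```

Because every update ends with a retraction, each agent's iterate stays exactly on the Stiefel manifold throughout, which is the feasibility guarantee the consensus step alone (a convex combination in the ambient space) would destroy.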

