Article; Proceedings Paper

First-order methods almost always avoid strict saddle points

Journal

MATHEMATICAL PROGRAMMING
Volume 176, Issue 1-2, Pages 311-337

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-019-01374-3

Keywords

Gradient descent; Smooth optimization; Saddle points; Local minimum; Dynamical systems


We establish that first-order methods avoid strict saddle points for almost all initializations. Our results apply to a wide variety of first-order methods, including (manifold) gradient descent, block coordinate descent, mirror descent and variants thereof. The connecting thread is that such algorithms can be studied from a dynamical systems perspective in which appropriate instantiations of the Stable Manifold Theorem allow for a global stability analysis. Thus, neither access to second-order derivative information nor randomness beyond initialization is necessary to provably avoid strict saddle points.
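The abstract's claim can be illustrated numerically. Below is a minimal sketch (not from the paper) using a hypothetical toy objective f(x, y) = x²/2 + y⁴/4 − y²/2, which has a strict saddle at the origin and local minima at (0, ±1): plain gradient descent from random initializations converges to a minimum, never to the saddle.

```python
import random

# Hypothetical toy objective (assumed for illustration, not from the paper):
#   f(x, y) = x**2 / 2 + y**4 / 4 - y**2 / 2
# Its gradient is (x, y**3 - y); the origin is a strict saddle
# (the Hessian there has eigenvalues +1 and -1), and (0, +1), (0, -1)
# are the two local minima.
def grad(x, y):
    return x, y**3 - y

random.seed(0)
eta = 0.1  # fixed step size, small enough for stability on this region

# Gradient descent from several random initializations: consistent with the
# paper's result, for almost every initialization the iterates avoid the
# strict saddle at (0, 0) and converge to one of the two minima.
for trial in range(5):
    x = random.uniform(-1.5, 1.5)
    y = random.uniform(-1.5, 1.5)
    for _ in range(2000):
        gx, gy = grad(x, y)
        x, y = x - eta * gx, y - eta * gy
    print(f"trial {trial}: x={x:.6f}, y={y:.6f}")
```

Each run ends with x ≈ 0 and |y| ≈ 1; only the measure-zero set of initializations with y exactly 0 would flow into the saddle, matching the "almost all initializations" statement.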

