Journal
SIAM Journal on Optimization
Volume 27, Issue 1, Pages 205-245
Publisher
SIAM Publications
DOI: 10.1137/15M1021106
Keywords
convex optimization; large-scale problems; Newton's method; random projection; randomized algorithms; random matrices; self-concordant functions; interior point method
Funding
- Office of Naval Research MURI [N00014-11-1-0688]
- Air Force Office of Scientific Research [AFOSR-FA9550-14-1-0016]
- National Science Foundation [CIF-31712-23800, DMS-1107000]
- Microsoft Research Fellowship
- Directorate for Mathematical & Physical Sciences
- Division of Mathematical Sciences [1612948] (National Science Foundation)
Abstract
We propose a randomized second-order method for optimization known as the Newton sketch: it is based on performing an approximate Newton step using a randomly projected Hessian. For self-concordant functions, we prove that the algorithm has superlinear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition numbers and related problem-dependent quantities. Given a suitable initialization, similar guarantees also hold for strongly convex and smooth objectives without self-concordance. When implemented using randomized projections based on a subsampled Hadamard basis, the algorithm typically has substantially lower complexity than Newton's method. We also describe extensions of our methods to programs involving convex constraints that are equipped with self-concordant barriers. We discuss and illustrate applications to linear programs, quadratic programs with convex constraints, logistic regression, and other generalized linear models, as well as semidefinite programs.
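The following Python fragment is a minimal, illustrative sketch of the core idea for one application named in the abstract, logistic regression; it is not the authors' implementation. The function name newton_sketch_logistic, the Gaussian sketching matrix, the fixed step size, and the ridge term are our own simplifying choices: the paper's complexity guarantees rely on structured projections such as a subsampled randomized Hadamard transform and on step sizes governed by self-concordance, not the fixed step used here.

import numpy as np

def newton_sketch_logistic(A, y, m, iters=20, step=1.0, seed=0):
    # The logistic-loss Hessian factors as A^T D A = (D^{1/2} A)^T (D^{1/2} A)
    # with D diagonal. Multiplying the n x d matrix D^{1/2} A by an m x n
    # random sketch S (m << n) gives the cheap approximate Hessian
    # (S D^{1/2} A)^T (S D^{1/2} A) used in place of the exact one.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-A @ x))              # model probabilities
        grad = A.T @ (p - y) / n                      # exact gradient of the mean loss
        w = np.sqrt(p * (1.0 - p) / n)                # diagonal entries of D^{1/2}
        S = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian sketch (illustrative)
        B = S @ (w[:, None] * A)                      # m x d sketched Hessian square root
        H = B.T @ B + 1e-8 * np.eye(d)                # approximate Hessian, regularized
        x = x - step * np.linalg.solve(H, grad)       # approximate Newton step
    return x

# Example usage on synthetic data: with n = 5000 samples and d = 20 features,
# a sketch of size m = 200 keeps each Hessian solve at O(m d^2 + d^3).
rng = np.random.default_rng(1)
A = rng.standard_normal((5000, 20))
y = (A @ rng.standard_normal(20) + 0.5 * rng.standard_normal(5000) > 0).astype(float)
x_hat = newton_sketch_logistic(A, y, m=200)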
Authors
Mert Pilanci; Martin J. Wainwright