Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 27, Issue 1, Pages 205-245
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/15M1021106
Keywords
convex optimization; large-scale problems; Newton's method; random projection; randomized algorithms; random matrices; self-concordant functions; interior point method
Funding
- Office of Naval Research MURI [N00014-11-1-0688]
- Air Force Office of Scientific Research [AFOSR-FA9550-14-1-0016]
- National Science Foundation [CIF-31712-23800, DMS-1107000]
- Microsoft Research Fellowship
- Directorate For Mathematical & Physical Sciences
- Division Of Mathematical Sciences [1612948] Funding Source: National Science Foundation
Abstract
We propose a randomized second-order method for optimization known as the Newton sketch: it is based on performing an approximate Newton step using a randomly projected Hessian. For self-concordant functions, we prove that the algorithm has superlinear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition numbers and related problem-dependent quantities. Given a suitable initialization, similar guarantees also hold for strongly convex and smooth objectives without self-concordance. When implemented using randomized projections based on a subsampled Hadamard basis, the algorithm typically has substantially lower complexity than Newton's method. We also describe extensions of our methods to programs involving convex constraints that are equipped with self-concordant barriers. We discuss and illustrate applications to linear programs, quadratic programs with convex constraints, logistic regression, and other generalized linear models, as well as semidefinite programs.
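To make the sketched Newton step concrete, here is a minimal Python sketch (not code from the paper) for the unconstrained least-squares objective f(x) = 0.5 * ||Ax - b||^2, whose Hessian A^T A has the square root A; each iteration replaces A^T A by (SA)^T (SA) for a random m x n sketching matrix S. The function name newton_sketch_least_squares is hypothetical, a plain Gaussian sketch stands in for the subsampled Hadamard sketch the paper recommends, and a unit step size replaces the line search used in the full algorithm.

```python
import numpy as np

def newton_sketch_least_squares(A, b, m, num_iters=10, seed=0):
    """Illustrative sketched Newton iterations for f(x) = 0.5*||Ax - b||^2.

    Hypothetical simplification of the method in the abstract: the exact
    Hessian A^T A is replaced by (S A)^T (S A) for an m x n Gaussian
    sketch S, with a unit step size instead of a line search.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                      # exact gradient of f
        S = rng.standard_normal((m, n)) / np.sqrt(m)  # random sketch, m >= d
        SA = S @ A                                    # sketched square root of the Hessian
        step = np.linalg.solve(SA.T @ SA, grad)       # approximate Newton direction
        x = x - step
    return x
```

Note that with a dense Gaussian sketch, forming S @ A costs O(mnd) and gives no speedup; the lower complexity claimed in the abstract relies on fast sketches such as the subsampled randomized Hadamard transform, which applies S in roughly O(nd log m) time. The sketch dimension m should be at least on the order of the dimension d (e.g., m = 4 * d) so that the sketched Hessian is invertible with high probability.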
Authors
Pilanci, Mert; Wainwright, Martin J.