Article

Finding Low-Rank Solutions via Nonconvex Matrix Factorization, Efficiently and Provably

Journal

SIAM JOURNAL ON IMAGING SCIENCES
Volume 11, Issue 4, Pages 2165-2204

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/17M1150189

Keywords

nonconvex optimization; matrix factorization; low-rank minimization


A rank-r matrix X ∈ R^{m×n} can be written as a product UV^T, where U ∈ R^{m×r} and V ∈ R^{n×r}. One could exploit this observation in optimization: e.g., consider the minimization of a convex function f(X) over rank-r matrices, where the set of low-rank matrices is modeled via UV^T. Though such a parameterization reduces the number of variables and is more computationally efficient (of particular interest is the case r << min{m, n}), it comes at a cost: f(UV^T) becomes a nonconvex function w.r.t. U and V. We study this parameterization for generic convex objectives f and focus on first-order, gradient descent algorithms. We propose the bifactored gradient descent (BFGD) algorithm, an efficient first-order method that operates directly on the U, V factors. We show that when f is (restricted) smooth, BFGD has local sublinear convergence; when f is both (restricted) smooth and (restricted) strongly convex, it has local linear convergence. For several applications, we provide simple and efficient initialization schemes whose initial conditions are good enough for the above convergence results to hold globally. Extensive experimental results support our claim that BFGD is an efficient and accurate nonconvex method compared with state-of-the-art approaches.
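The abstract describes gradient descent applied directly to the factors U and V of X = UV^T. A minimal sketch of this factored update, assuming the simple objective f(X) = ½‖X − M‖_F² and a small random initialization (the paper's actual BFGD step-size rules and initialization schemes are more careful than this), might look like:

```python
import numpy as np

def bfgd_sketch(M, r, step=0.01, iters=2000):
    """Hypothetical sketch of factored gradient descent on
    f(X) = 0.5 * ||X - M||_F^2, optimizing over the factors
    U (m x r) and V (n x r) of X = U V^T instead of X itself."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    # Small random initialization (an assumption; the paper proposes
    # application-specific initialization schemes).
    U = rng.standard_normal((m, r)) / np.sqrt(m)
    V = rng.standard_normal((n, r)) / np.sqrt(n)
    for _ in range(iters):
        G = U @ V.T - M  # gradient of f at the current X = U V^T
        # Chain rule: grad w.r.t. U is G @ V, grad w.r.t. V is G.T @ U.
        U, V = U - step * (G @ V), V - step * (G.T @ U)
    return U, V
```

The key point the abstract makes is visible here: even though f(X) is convex in X, the iteration above operates on a nonconvex objective in (U, V), yet with a suitable step size and initialization it converges to a low-rank solution.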
