Article

Accelerated and Inexact Soft-Impute for Large-Scale Matrix and Tensor Completion

Journal

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2018.2867533

Keywords

Matrix completion; tensor completion; collaborative filtering; link prediction; proximal algorithms

Funding

  1. Research Grants Council of the Hong Kong Special Administrative Region [614513]
  2. 4Paradigm Inc.

Abstract

Matrix and tensor completion aim to recover a low-rank matrix/tensor from limited observations, and are commonly used in applications such as recommender systems and multi-relational data mining. A state-of-the-art matrix completion algorithm is Soft-Impute, which exploits the special "sparse plus low-rank" structure of the matrix iterates to allow an efficient SVD in each iteration. Although Soft-Impute is a proximal algorithm, it is generally believed that acceleration destroys this special structure and is thus not useful. In this paper, we show that Soft-Impute can indeed be accelerated without compromising this structure. To further reduce the per-iteration time complexity, we propose an approximate singular value thresholding scheme based on the power method. Theoretical analysis shows that the proposed algorithm still enjoys the fast O(1/T^2) convergence rate of accelerated proximal algorithms. We also extend the proposed algorithm to tensor completion with the scaled latent nuclear norm regularizer. We show that a similar sparse plus low-rank structure also exists, leading to low iteration complexity and the same fast O(1/T^2) convergence rate. Moreover, the proposed algorithm can be further extended to nonconvex low-rank regularizers, which have better empirical performance than the convex nuclear norm regularizer. Extensive experiments demonstrate that the proposed algorithm is much faster than Soft-Impute and other state-of-the-art matrix and tensor completion algorithms.
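The proximal step at the heart of Soft-Impute is singular value thresholding (SVT), the proximal operator of the nuclear norm. The following is a minimal NumPy sketch of plain SVT (not the paper's accelerated or power-method variant); the function name `svt` and the dense full SVD are illustrative assumptions, whereas the paper exploits the sparse-plus-low-rank iterate structure to avoid exactly this dense computation.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: prox of tau * nuclear norm.

    Shrinks each singular value of X toward zero by tau, which
    both denoises and reduces the rank of the result.
    Note: a dense full SVD, shown here only for clarity; scalable
    solvers replace it with structured or approximate SVDs.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)   # soft-threshold the spectrum
    return (U * s_shrunk) @ Vt            # rebuild with shrunken singular values

# Illustrative usage: thresholding a diagonal matrix shrinks its
# singular values 3, 1, 0.5 to 2, 0, 0 (rank drops from 3 to 1).
Y = svt(np.diag([3.0, 1.0, 0.5]), 1.0)
```

In each Soft-Impute iteration this operator is applied to an iterate of the form "sparse + low-rank", so matrix-vector products (and hence power-method-style approximate SVDs) stay cheap even at large scale.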

