Article

Robust tensor decomposition via t-SVD: Near-optimal statistical guarantee and scalable algorithms

Journal

SIGNAL PROCESSING
Volume 167, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.sigpro.2019.107319

Keywords

Tensor recovery; Tensor SVD; Low-rank recovery; Estimation error; ADMM

Funding

  1. National Natural Science Foundation of China [61872188, U1713208, 61602244, 61672287, 61702262, 61773215, 61703209]

Aiming at recovering a signal tensor from its mixture with outliers and noise, robust tensor decomposition (RTD) arises frequently in many real-world applications. Recently, the low-tubal-rank model has shown stronger performance than traditional tensor low-rank models in several tensor recovery tasks. Assuming the underlying tensor to be low-tubal-rank and the outliers sparse, this paper first proposes a penalized least squares estimator for RTD. Specifically, we adopt the tubal nuclear norm (TNN) and a sparsity-inducing norm to regularize the underlying tensor and the outliers, respectively. Then, from a statistical standpoint, non-asymptotic upper bounds on the estimation error are established and proved to be near-optimal in a minimax sense. Further, from a computational standpoint, two algorithms, namely an ADMM-based algorithm and a Frank-Wolfe (FW) based algorithm, are proposed to efficiently compute the proposed estimator. The sharpness of the proposed upper bound is verified on synthetic datasets. The superiority and efficiency of the proposed algorithms are demonstrated in experiments on real datasets. (C) 2019 Elsevier B.V. All rights reserved.
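The tubal nuclear norm used as the low-rank regularizer above is defined through the t-SVD: under one common convention, it equals the average over the Fourier-domain frontal slices of their matrix nuclear norms. A minimal sketch of this computation (assuming that convention; the function name and normalization are illustrative, not taken from the paper):

```python
import numpy as np

def tubal_nuclear_norm(X):
    """Sketch of the tubal nuclear norm (TNN) of a 3-way tensor X.

    Convention assumed here: take the DFT along the third (tubal) mode,
    sum the matrix nuclear norms of the resulting frontal slices, and
    divide by n3. Other papers use an unnormalized variant.
    """
    n3 = X.shape[2]
    Xf = np.fft.fft(X, axis=2)  # frontal slices in the Fourier domain
    # Nuclear norm (sum of singular values) of each Fourier-domain slice
    return sum(np.linalg.norm(Xf[:, :, k], 'nuc') for k in range(n3)) / n3
```

For example, a tensor whose first frontal slice is the 2x2 identity and whose other slices are zero has identical Fourier-domain slices (each the identity), so its TNN under this convention is 2.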
