Article

Statistical performance of convex low-rank and sparse tensor recovery

Journal

PATTERN RECOGNITION
Volume 93, Pages 193-203

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2019.03.014

Keywords

Tensor recovery; Statistical performance; Tucker rank; Tensor de-noising; Tensor compressive sensing

Funding

  1. National Basic Research Program of China [2014CB349303]
  2. National Natural Science Foundation of China [61872188]
  3. Major Program of the National Natural Science Foundation of China [2015ZX01041101]

Abstract

Low-rank or sparse tensor recovery has many applications in computer vision and machine learning. The recently proposed regularized multilinear regression and selection (Remurs) model assumes that the true tensor is simultaneously low-Tucker-rank and sparse, and it has been successfully applied to fMRI analysis. However, a statistical performance analysis of Remurs-like models is still lacking. To address this gap, a minimization problem based on a newly defined tensor nuclear-ℓ1-norm is proposed to recover a simultaneously low-Tucker-rank and sparse tensor from its degraded observations. An M-ADMM-based algorithm is then developed to solve the problem efficiently. Furthermore, the statistical performance is analyzed by establishing a deterministic upper bound on the estimation error under general noise. In addition, under Gaussian noise, non-asymptotic upper bounds are given for two specific settings, namely noisy tensor decomposition and random Gaussian design. Experiments on synthetic datasets demonstrate that the proposed theorems precisely predict the scaling behavior of the estimation error. © 2019 Published by Elsevier Ltd.
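
The listing does not reproduce the paper's exact definition of the tensor nuclear-ℓ1-norm or of the estimator built on it. As a minimal, illustrative sketch only (the symbols y, \mathcal{A}, \mathbf{X}_{(n)}, \lambda and \gamma are generic notation introduced here, not taken from the paper), a Remurs-style simultaneously low-Tucker-rank and sparse estimator is commonly written as

  \min_{\mathcal{X}} \; \tfrac{1}{2}\,\big\|\mathbf{y}-\mathcal{A}(\mathcal{X})\big\|_2^2
    \;+\; \lambda \sum_{n=1}^{N}\big\|\mathbf{X}_{(n)}\big\|_{*}
    \;+\; \gamma\,\big\|\operatorname{vec}(\mathcal{X})\big\|_{1},

where \mathcal{A} is a linear measurement operator (the identity map in the tensor de-noising setting, a random Gaussian design in tensor compressive sensing), \mathbf{X}_{(n)} is the mode-n unfolding of the N-way tensor \mathcal{X}, \|\cdot\|_{*} is the matrix nuclear norm that promotes low Tucker rank, and \lambda, \gamma \ge 0 weight the low-rank and sparsity priors. Whether the paper's nuclear-ℓ1-norm uses exactly this weighting is an assumption of the sketch; what the form does convey is the separable, multi-block structure of the regularizer, which is what makes an M-ADMM-style multi-block splitting, as mentioned in the abstract, a natural solver.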
