Article

Towards Flexible Sparsity-Aware Modeling: Automatic Tensor Rank Learning Using the Generalized Hyperbolic Prior

Journal

IEEE TRANSACTIONS ON SIGNAL PROCESSING
Volume 70, Pages 1834-1849

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSP.2022.3164200

Keywords

Tensors; Probabilistic logic; Data models; Mathematical models; Inference algorithms; Bayes methods; Programming; Automatic tensor rank learning; tensor CPD; generalized hyperbolic distribution; Bayesian learning; variational inference

Funding

  1. NSFC [62001309, 61671411, 61731018, U1709219]
  2. Science and Technology on Sonar Laboratory [6142109KF212204]
  3. National Key Research and Development Project [2017YFE0119300]
  4. Hong Kong Research Grant Council [17207018]

The study investigates automatic tensor rank learning for tensor canonical polyadic decomposition (CPD). Bayesian inference under the Gaussian-gamma prior is an effective strategy for this task, but it does not perform well for high-rank tensors and/or low signal-to-noise ratios. To overcome this issue, a more advanced generalized hyperbolic prior is introduced and an algorithm based on variational inference is developed, resulting in significantly improved performance.
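
For context, in its symmetric zero-mean form the generalized hyperbolic prior can be viewed as a Gaussian scale mixture whose mixing density is the generalized inverse Gaussian (GIG); the expression below is the standard textbook form and may differ from the exact parameterization adopted in the paper:

    p(a) = \int_0^{\infty} \mathcal{N}(a \mid 0, w)\, \mathrm{GIG}(w \mid \lambda, \delta, \gamma)\, \mathrm{d}w,
    \qquad
    \mathrm{GIG}(w \mid \lambda, \delta, \gamma) = \frac{(\gamma/\delta)^{\lambda}}{2 K_{\lambda}(\delta\gamma)}\, w^{\lambda - 1} \exp\!\Big(-\tfrac{1}{2}\big(\gamma^{2} w + \delta^{2} w^{-1}\big)\Big),

where K_\lambda denotes the modified Bessel function of the second kind. Tuning (\lambda, \delta, \gamma) moves the marginal between heavier- and lighter-tailed regimes, which is the flexibility the summary refers to; the Gaussian-gamma (Student-t-type) model corresponds to the limit \gamma \to 0 with \lambda < 0, where the GIG collapses to an inverse-gamma prior on the variance, equivalently a gamma prior on the precision.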
Tensor rank learning for canonical polyadic decomposition (CPD) has long been deemed an essential yet challenging problem. In particular, since the tensor rank controls the complexity of the CPD model, learning it inaccurately would cause overfitting to noise or underfitting to the signal sources, and could even destroy the interpretability of the model parameters. However, optimal determination of the tensor rank is known to be a non-deterministic polynomial-time hard (NP-hard) task. Rather than exhaustively searching for the best tensor rank via trial-and-error experiments, Bayesian inference under the Gaussian-gamma prior was introduced in the context of probabilistic CPD modeling and was shown to be an effective strategy for automatic tensor rank determination. This triggered flourishing research on other structured tensor CPDs with automatic tensor rank learning. On the other side of the coin, these works also reveal that the Gaussian-gamma model does not perform well for high-rank tensors and/or low signal-to-noise ratios (SNRs). To overcome these drawbacks, in this paper we introduce a more advanced generalized hyperbolic (GH) prior into the probabilistic CPD model, which not only includes the Gaussian-gamma model as a special case but is also more flexible in adapting to different levels of sparsity. Based on this novel probabilistic model, an algorithm is developed under the framework of variational inference, where each update is obtained in closed form. Extensive numerical results on synthetic data and real-world datasets demonstrate the significantly improved performance of the proposed method in learning both low and high tensor ranks, even in low SNR cases.
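
As a rough, illustrative sketch of the general recipe behind such automatic rank learning (over-parameterize the rank, let a sparsity-inducing prior on the per-component scales drive redundant rank-1 terms toward zero, then prune them), the NumPy code below uses gamma-type precision updates, i.e., the Gaussian-gamma baseline rather than the paper's GH prior, and simplified point-estimate updates in place of full variational posteriors. The names cpd_auto_rank, khatri_rao, and unfold are hypothetical and not taken from the paper.

    import numpy as np

    def khatri_rao(U, V):
        # Column-wise Khatri-Rao product; rows of U vary slowest.
        return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

    def unfold(T, mode):
        # Mode-n unfolding with C-order flattening of the remaining axes.
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def cpd_auto_rank(T, R_init=10, n_iters=100, tau=1.0, a0=1e-6, b0=1e-6,
                      prune_tol=1e-6):
        """Over-parameterized 3-way CPD with per-component precisions lam[r];
        components whose power collapses are pruned, which estimates the rank."""
        dims = T.shape
        rng = np.random.default_rng(0)
        factors = [rng.standard_normal((d, R_init)) for d in dims]
        lam = np.ones(R_init)
        for _ in range(n_iters):
            for n in range(3):
                others = [factors[m] for m in range(3) if m != n]
                KR = khatri_rao(others[0], others[1])  # ordering matches unfold()
                G = KR.T @ KR
                # Ridge-type closed-form update: diag(lam)/tau plays the role of
                # the prior precision under a Gaussian likelihood with precision tau.
                rhs = KR.T @ unfold(T, n).T
                factors[n] = np.linalg.solve(G + np.diag(lam) / tau, rhs).T
            # Gamma-type re-estimation of per-component precisions from their power.
            power = sum((F ** 2).sum(axis=0) for F in factors)
            lam = (2.0 * a0 + sum(dims)) / (2.0 * b0 + power)
            # Prune rank-1 components whose relative power has collapsed.
            keep = power / power.max() > prune_tol
            factors = [F[:, keep] for F in factors]
            lam = lam[keep]
        return factors

    # Toy check: noiseless rank-3 tensor fitted with R_init = 8; redundant
    # components are expected (though not guaranteed) to be pruned away.
    rng = np.random.default_rng(1)
    A, B, C = (rng.standard_normal((20, 3)) for _ in range(3))
    T = np.einsum('ir,jr,kr->ijk', A, B, C)
    print('estimated rank:', cpd_auto_rank(T, R_init=8)[0].shape[1])

Under the GH prior studied in the paper, the per-component scale posterior becomes a generalized inverse Gaussian, whose moments are ratios of modified Bessel functions (available as scipy.special.kv); such Bessel-ratio expectations would replace the simple gamma-type update above and are what let the model adapt to different sparsity levels and SNRs.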
