Article

Bilinear Generalized Approximate Message Passing-Part II: Applications

Journal

IEEE Transactions on Signal Processing
Volume 62, Issue 22, Pages 5854-5867

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TSP.2014.2357773

Keywords

Approximate message passing; belief propagation; bilinear estimation; matrix completion; dictionary learning; robust principal components analysis; matrix factorization

Funding

  1. AFOSR Lab [11RY02COR]
  2. NSF [IIP-0968910, CCF-1018368, CCF-1218754]
  3. DARPA/ONR [N66001-10-1-4090]
  4. European Commission [MIRG-268398]
  5. ERC Future Proof
  6. SNF [200021-132548, 200021-146750, CRSII2-147633]
  7. Division of Computing and Communication Foundations
  8. Directorate for Computer & Information Science & Engineering (NSF) [1218754]

Abstract

In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case. In Part I of this two-part paper, we derived our Bilinear G-AMP (BiG-AMP) algorithm as an approximation of the sum-product belief propagation algorithm in the high-dimensional limit, and proposed an adaptive damping mechanism that aids convergence under finite problem sizes, an expectation-maximization (EM)-based method to automatically tune the parameters of the assumed priors, and two rank-selection strategies. Here, in Part II, we discuss the specializations of BiG-AMP to the problems of matrix completion, robust PCA, and dictionary learning, and present the results of an extensive empirical study comparing BiG-AMP to state-of-the-art algorithms on each problem. Our numerical results, using both synthetic and real-world datasets, demonstrate that EM-BiG-AMP yields excellent reconstruction accuracy (often best in class) while maintaining competitive runtimes.
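
The three applications treated in Part II are all instances of recovering the factors of a product Z = AX from noisy, incomplete, or corrupted observations of Z, with each application distinguished by its likelihood and priors. As a rough illustration only (this is not the authors' BiG-AMP code; the dimensions, sampling rate, corruption rate, sparsity level, noise level, and variable names below are arbitrary assumptions), the following NumPy sketch generates toy instances of the three problem setups:

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, r = 100, 120, 5              # assumed sizes: Z = A @ X with A (m x r), X (r x n)

    # Shared low-rank ground truth
    A = rng.standard_normal((m, r))
    X = rng.standard_normal((r, n))
    Z = A @ X

    # 1) Matrix completion: only a random subset of entries of Z is observed, in light noise
    observed = rng.random((m, n)) < 0.2                     # ~20% sampling rate (assumption)
    Y_mc = np.where(observed, Z + 0.01 * rng.standard_normal((m, n)), np.nan)

    # 2) Robust PCA: low-rank Z plus a sparse matrix of large outliers
    S = np.where(rng.random((m, n)) < 0.05,
                 10.0 * rng.standard_normal((m, n)), 0.0)   # ~5% gross corruptions (assumption)
    Y_rpca = Z + S + 0.01 * rng.standard_normal((m, n))

    # 3) Dictionary learning: dense dictionary D, sparse coefficient matrix C
    d = 2 * m                                               # overcomplete dictionary (assumption)
    D = rng.standard_normal((m, d))
    C = np.where(rng.random((d, n)) < 0.05,
                 rng.standard_normal((d, n)), 0.0)          # Bernoulli-Gaussian-style sparse codes
    Y_dl = D @ C + 0.01 * rng.standard_normal((m, n))

In each case, an algorithm such as the EM-BiG-AMP described in the abstract would take the observed matrix (Y_mc, Y_rpca, or Y_dl) and jointly estimate the two factors, with the EM step tuning the prior and noise parameters.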

Authors

Jason T. Parker, Philip Schniter, and Volkan Cevher
