Article

Online Sparsifying Transform Learning-Part II: Convergence Analysis

Journal

IEEE Journal of Selected Topics in Signal Processing

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/JSTSP.2015.2407860

Keywords

Sparse representations; sparsifying transforms; convergence guarantees; online learning; big data; dictionary learning; machine learning

Funding

  1. National Science Foundation (NSF) [CCF-1018660, CCF-1320953]
  2. NSF Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1018660, 1320953]

Abstract

Sparsity-based techniques are widely used in signal processing applications such as compression, denoising, and compressed sensing. Recently, the learning of sparsifying transforms for data has received increasing interest. The advantage of the transform model is that it enables cheap and exact computation of sparse representations. In Part I of this work, efficient methods for online learning of square sparsifying transforms were introduced and investigated numerically. The online schemes process signals sequentially and can be especially useful for big data and for real-time or limited-latency signal processing applications. In this paper, we prove that, although the associated optimization problems are non-convex, the online transform learning algorithms are guaranteed to converge to the set of stationary points of the learning problem. The guarantee relies on a few simple assumptions. In practice, the algorithms work well, as demonstrated by example applications to signal representation and denoising.
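To make the transform model's "cheap and exact" sparse coding concrete, the NumPy sketch below illustrates an online loop of the kind described in the abstract. The thresholding step, which keeps the s largest-magnitude entries of W x, is the exact sparse code in the transform model; the gradient-based transform update, the regularizer weight `lam`, the step size `step`, and the function names are illustrative assumptions for this sketch, not the update rules introduced in Part I or analyzed in this paper.

```python
import numpy as np


def sparse_code(W, x, s):
    """Exact transform-model sparse coding: keep the s largest-magnitude
    entries of W @ x and zero out the rest (hard thresholding by sparsity level)."""
    z = W @ x
    alpha = np.zeros_like(z)
    keep = np.argsort(np.abs(z))[-s:]      # indices of the s largest entries
    alpha[keep] = z[keep]
    return alpha


def online_transform_learning(signal_stream, n, s, lam=1e-2, step=1e-3):
    """Illustrative online loop (not the paper's algorithm): sparse code each
    incoming signal exactly, then take a small gradient step on the per-signal
    objective  ||W x - alpha||^2 + lam * (||W||_F^2 - log |det W|),
    whose regularizer discourages trivial (zero or singular) transforms."""
    rng = np.random.default_rng(0)
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))   # start near identity
    for x in signal_stream:
        alpha = sparse_code(W, x, s)
        residual = W @ x - alpha
        # Gradient of the per-signal objective with respect to W.
        grad = 2.0 * np.outer(residual, x) + lam * (2.0 * W - np.linalg.inv(W).T)
        W = W - step * grad                               # simple SGD-style update
    return W


# Example usage on a synthetic stream of 16-dimensional signals, sparsity 3.
rng = np.random.default_rng(1)
stream = (rng.standard_normal(16) for _ in range(500))
W_hat = online_transform_learning(stream, n=16, s=3)
```

The negative log-determinant and Frobenius-norm terms in the sketch's objective are the standard regularizers used in sparsifying transform learning to keep the learned transform non-singular and well-conditioned; the specific online update rules and their convergence to stationary points are what this paper establishes.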
