Journal
NEUROCOMPUTING
Volume 273, Pages 414-423
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2017.07.041
Keywords
Dictionary learning; Structured noise; Low rank representation; Sparse representation
Funding
- National Key Basic Research Project of China (973 Program) [2015CB352303, 2015CB352502]
- National Nature Science Foundation (NSF) of China [61671027, 61625301, 61731018, 61231002]
- Microsoft Research Asia Collaborative Research Program
Many dictionary learning methods have been proposed and successfully applied in recent years. However, most of them assume that the noise in the data follows a Gaussian or Laplacian distribution, and they therefore adopt the l2 or l1 norm, respectively, to characterize it. Since this assumption often fails to hold for real data, the performance of these methods is limited. In this paper, we propose a novel dictionary learning with structured noise (DLSN) method for handling noisy data. We decompose the original data into three parts: clean data, structured noise, and Gaussian noise, and characterize each part separately. We use the low-rank property to preserve the inherent subspace structure of the clean data. Instead of fitting the real noise distribution with a predefined one alone, we learn an adaptive dictionary to characterize the structured noise and employ the l2 norm to model the Gaussian noise. This mechanism characterizes noise more precisely. We also prove that the proposed optimization method converges to a critical point at an at least sub-linear rate. Experimental results on the data clustering task demonstrate the effectiveness and robustness of our method. (C) 2017 Elsevier B.V. All rights reserved.
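The three-part decomposition described in the abstract can be illustrated with a minimal sketch. The paper's actual method learns an adaptive dictionary for the structured-noise term; as a simplified stand-in, the sketch below models the structured noise as a sparse component (via l1 soft-thresholding) and the clean data as a low-rank component (via singular value thresholding), with the residual treated as Gaussian noise. The function names, the naive alternating scheme, and the parameter values `lam` and `tau` are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm,
    used here to promote a low-rank clean-data estimate."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, lam):
    """Entrywise soft thresholding: proximal operator of the l1 norm,
    used here as a simple stand-in for dictionary-based structured noise."""
    return np.sign(M) * np.maximum(np.abs(M) - lam, 0.0)

def decompose(X, lam=0.1, tau=1.0, n_iter=50):
    """Alternately estimate a low-rank part L and a sparse part S so that
    X ~ L + S + E, where E is the residual treated as Gaussian noise."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(n_iter):
        L = svt(X - S, tau)   # low-rank clean-data estimate
        S = soft(X - L, lam)  # sparse structured-noise estimate
    E = X - L - S             # residual (Gaussian-noise part)
    return L, S, E

# Usage: decompose a synthetic low-rank-plus-sparse matrix.
rng = np.random.default_rng(0)
L0 = rng.standard_normal((30, 5)) @ rng.standard_normal((5, 30))
S0 = np.where(rng.random((30, 30)) < 0.05, 5.0, 0.0)
L, S, E = decompose(L0 + S0)
```

By construction the three returned parts always sum back to the input exactly; how much of the corruption ends up in `S` versus `E` depends on the thresholds, which is precisely the modeling gap that learning an adaptive noise dictionary is meant to close.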