Article

Exponential Graph Regularized Non-Negative Low-Rank Factorization for Robust Latent Representation

Journal

MATHEMATICS
Volume 10, Issue 22, Pages: -

Publisher

MDPI
DOI: 10.3390/math10224314

Keywords

non-negative matrix factorization; low rank; matrix exponential; graph embedding; image representation

Funding

  1. Postgraduate Research & Practice Innovation Program of Jiangsu Province [KYCX22_2219]
  2. National Natural Science Foundation of China [62172229, 61876213]
  3. Natural Science Foundation of Jiangsu Province [BK20211295, BK20201397]
  4. Jiangsu Key Laboratory of Image and Video Understanding for Social Safety of Nanjing University of Science and Technology [J2021-4]
  5. Qing Lan Project of Jiangsu University [SRFP-2021-YB-25]
  6. Austrian Science Fund (FWF) [J2021]

Abstract

This study proposes an exponential graph regularized non-negative low-rank factorization algorithm to improve the performance and robustness of NMF. By combining a low-rank constraint, non-negative factorization, and graph embedding with matrix exponentiation, it learns latent data representations that are undisturbed by noise while preserving the local structure of known samples.
Non-negative matrix factorization (NMF) is a fundamental technique that has received much attention and is widely used in image engineering, pattern recognition and other fields. However, classical NMF has limitations such as focusing only on local information, sensitivity to noise, and small sample size (SSS) problems. Developing NMF to improve the performance and robustness of the algorithm is therefore a worthwhile challenge. To address these bottlenecks, we propose an exponential graph regularized non-negative low-rank factorization algorithm (EGNLRF) combining sparseness, low rank and the matrix exponential. First, based on the assumption that the data are corrupted, we decompose the given raw data into a clean component and an error matrix fitting the noise, and apply a low-rank constraint to the denoised component. Then, we perform a non-negative factorization on the resulting low-rank matrix, from which we derive the low-dimensional representation of the original matrix. Finally, we use the low-dimensional representation for graph embedding to preserve the geometry between samples. The graph embedding term is matrix exponentiated to cope with SSS problems and nearest-neighbor sensitivity. These three steps are incorporated into a joint framework so that they validate and optimize each other; we can therefore learn latent data representations that are undisturbed by noise and preserve the local structure of known samples. We conducted simulation experiments on different datasets and verified the effectiveness of the algorithm by comparing the proposed method with existing ones related to NMF, low rank and graph embedding.
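
To make the three steps described in the abstract concrete, a minimal numerical sketch is given below. It is not the authors' EGNLRF formulation or optimization scheme: the function `egnlrf_objective`, its weights `lam`, `beta` and `gamma`, and the k-nearest-neighbor graph construction are illustrative assumptions that simply combine the named ingredients (noise splitting, a nuclear-norm low-rank surrogate, non-negative factorization, and a matrix-exponential graph regularizer).

```python
# Minimal sketch (NOT the authors' exact EGNLRF model or updates): it only
# illustrates how the abstract's three ingredients can be assembled into one
# composite objective on a toy non-negative data matrix.
import numpy as np
from scipy.linalg import expm
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph

def egnlrf_objective(X, Z, E, W, H, L_exp, lam=1.0, beta=0.1, gamma=0.01):
    """Hypothetical composite objective combining the terms named in the abstract."""
    fit      = np.linalg.norm(X - Z - E, 'fro') ** 2   # data = clean part + error
    low_rank = np.linalg.norm(Z, 'nuc')                # nuclear norm as a rank surrogate
    factor   = np.linalg.norm(Z - W @ H, 'fro') ** 2   # non-negative factorization Z ~ W H
    sparse_e = np.abs(E).sum()                         # sparse error (noise) term
    graph    = np.trace(H @ L_exp @ H.T)               # graph embedding with exp(Laplacian)
    return fit + lam * low_rank + beta * factor + gamma * graph + sparse_e

# Toy usage on random non-negative data (50 features, 30 samples as columns).
rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((50, 30)))

# k-NN graph over the 30 samples, symmetrized, then its Laplacian and matrix exponential.
A = kneighbors_graph(X.T, n_neighbors=5, mode='connectivity', include_self=False)
L = laplacian(0.5 * (A + A.T)).toarray()
L_exp = expm(L)

r = 5
W = np.abs(rng.standard_normal((50, r)))
H = np.abs(rng.standard_normal((r, 30)))
print(egnlrf_objective(X, X.copy(), np.zeros_like(X), W, H, L_exp))
```

In this sketch the matrix exponential of the Laplacian plays the role the abstract assigns to exponentiating the graph embedding term; the actual placement of the exponential and the optimization algorithm should be taken from the paper itself.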

