Article

Graph regularized and sparse nonnegative matrix factorization with hard constraints for data representation

Journal

NEUROCOMPUTING
Volume 173, Pages 233-244

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2015.01.103

Keywords

Nonnegative matrix factorization; Graph-based regularizer; Sparseness constraints; Label information

Funding

  1. National Natural Science Foundation of China [61572244]

Abstract

Nonnegative Matrix Factorization (NMF), a popular technique for finding parts-based, linear representations of nonnegative data, has been successfully applied in a wide range of applications. This is because it provides physically meaningful and interpretable components, consistent with the psychological intuition of combining parts to form a whole. For practical classification tasks, however, standard NMF ignores both the local geometry of the data and the discriminative information of different classes. In addition, existing research demonstrates that enforcing sparseness can greatly enhance the ability to learn parts-based representations. Motivated by these advances, we propose a novel matrix decomposition algorithm, called Graph regularized and Sparse Non-negative Matrix Factorization with hard Constraints (GSNMFC). It attempts to find a compact representation of the data so that further learning tasks can be facilitated. The proposed GSNMFC jointly incorporates a graph regularizer, hard prior label information, and a sparseness constraint as additional conditions to uncover the intrinsic geometrical and discriminative structures of the data space. The corresponding update rules and the convergence proofs for the optimization problem are also given in detail. Experimental results demonstrate the effectiveness of our algorithm in comparison to state-of-the-art approaches through a set of evaluations. (C) 2015 Elsevier B.V. All rights reserved.
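The abstract describes GSNMFC as standard NMF augmented with a graph regularizer, a sparseness penalty, and hard label constraints. As a rough illustration of the first two ingredients only, the sketch below shows graph-regularized NMF with an L1 penalty solved by multiplicative updates. It is a hypothetical simplification, not the paper's GSNMFC algorithm: the hard label constraints and the authors' derived update rules are omitted, and all function names, parameters, and the choice of affinity graph are assumptions.

import numpy as np

def graph_sparse_nmf(X, k, W, lam=1.0, beta=0.1, n_iter=200, eps=1e-9):
    # Hypothetical sketch, not the paper's GSNMFC update rules.
    # X: (m, n) nonnegative data matrix, one sample per column.
    # k: number of latent components.
    # W: (n, n) symmetric nonnegative affinity matrix over the samples.
    # lam: graph-regularization weight; beta: L1 sparseness weight on V.
    m, n = X.shape
    rng = np.random.default_rng(0)
    U = rng.random((m, k))           # basis matrix
    V = rng.random((n, k))           # coefficient matrix, one row per sample
    D = np.diag(W.sum(axis=1))       # degree matrix; graph Laplacian L = D - W

    for _ in range(n_iter):
        # Standard multiplicative update for the basis U.
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        # Update for V: the graph term pulls the codes of neighboring samples
        # together, and the constant beta in the denominator shrinks V toward
        # sparsity (an L1 penalty on the coefficients).
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + beta + eps)
    return U, V

In practice the affinity matrix W would typically be a k-nearest-neighbor graph built over the training samples, and the hard label information emphasized in the abstract would further constrain the coefficient rows of labeled samples, which this sketch does not attempt to model.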
