4.7 Review

A survey on deep matrix factorizations

Journal

Computer Science Review
Volume 42, Article 100423

Publisher

Elsevier
DOI: 10.1016/j.cosrev.2021.100423

Keywords

Machine learning; Matrix factorizations; Deep learning; Data mining; Unsupervised learning

Funding

  1. European Research Council (ERC) [679515]
  2. Fonds de la Recherche Scientifique (F.R.S.-FNRS), Belgium
  3. Fonds Wetenschappelijk Onderzoek - Vlaanderen (FWO), Belgium [O005318F-RG47]

Abstract

Constrained low-rank matrix approximations have been known for decades as powerful linear dimensionality reduction techniques able to extract the information contained in large data sets in a relevant way. However, such low-rank approaches are unable to mine the complex, interleaved features that underlie hierarchical semantics. Recently, deep matrix factorization (deep MF) was introduced to extract several layers of features and has been shown to achieve outstanding performance on unsupervised tasks. Deep MF was motivated by the success of deep learning, as it is conceptually close to some neural network paradigms. In this survey paper, we present the main models, algorithms, and applications of deep MF through a comprehensive literature review. We also discuss theoretical questions and research perspectives, as deep MF is likely to become an important paradigm in unsupervised learning in the next few years. © 2021 Elsevier Inc. All rights reserved.
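To make the idea of "several layers of features" concrete, the following is a minimal, hedged sketch of one common deep MF scheme: each level recursively factorizes the coefficient matrix of the previous level, yielding X ≈ W1 W2 … WL HL. The inner NMF uses basic Lee-Seung multiplicative updates; the function names (`nmf`, `deep_mf`) and the sequential (non-fine-tuned) strategy are illustrative assumptions, not the specific algorithms covered by the survey.

```python
import numpy as np

def nmf(X, r, n_iter=200, seed=0):
    """Basic nonnegative matrix factorization X ~= W @ H (rank r),
    using Lee-Seung multiplicative updates on the Frobenius loss."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 1e-3  # strictly positive init
    H = rng.random((r, n)) + 1e-3
    for _ in range(n_iter):
        # Multiplicative updates keep W, H nonnegative by construction.
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def deep_mf(X, ranks, n_iter=200):
    """Sequential deep MF sketch: X ~= W1 W2 ... WL HL, obtained by
    recursively factorizing each level's coefficient matrix H.
    `ranks` should be decreasing, e.g. [8, 4]."""
    Ws = []
    H = X
    for level, r in enumerate(ranks):
        W, H = nmf(H, r, n_iter=n_iter, seed=level)
        Ws.append(W)
    return Ws, H
```

For example, with `ranks=[6, 4]` the first level extracts 6 broad features and the second compresses them into 4 coarser ones; the product `Ws[0] @ Ws[1] @ H` reconstructs the data. Real deep MF methods typically add a fine-tuning phase that re-optimizes all levels jointly, which this sequential sketch omits.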

Authors

Pierre De Handschutter; Nicolas Gillis; Xavier Siebert
