Journal
COMPUTER SCIENCE REVIEW
Volume 42
Publisher
ELSEVIER
DOI: 10.1016/j.cosrev.2021.100423
Keywords
Machine learning; Matrix factorizations; Deep learning; Data mining; Unsupervised learning
Funding
- European Research Council (ERC) [679515]
- Fonds de la Recherche Scientifique -FNRS, Belgium
- Fonds Wetenschappelijk Onderzoek, Belgium -Vlaanderen (FWO) [O005318F-RG47]
Abstract
Constrained low-rank matrix approximations have been known for decades as powerful linear dimensionality reduction techniques able to extract the information contained in large data sets in a relevant way. However, such low-rank approaches are unable to mine the complex, interleaved features that underlie hierarchical semantics. Recently, deep matrix factorization (deep MF) was introduced to extract several layers of features, and has been shown to achieve outstanding performance on unsupervised tasks. Deep MF was motivated by the success of deep learning, as it is conceptually close to some neural network paradigms. In this survey paper, we present the main models, algorithms, and applications of deep MF through a comprehensive literature review. We also discuss theoretical questions and research perspectives, as deep MF is likely to become an important paradigm in unsupervised learning in the next few years. (C) 2021 Elsevier Inc. All rights reserved.
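As a minimal illustration of the multi-layer factorization idea described in the abstract (a sketch of one common deep MF construction, not the specific algorithms surveyed in the paper), each layer factorizes the coefficient matrix of the previous one, so that X ≈ W1 W2 H2. The NMF subroutine below uses standard multiplicative updates and is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(X, r, n_iter=200, eps=1e-9):
    """One NMF layer, X ~= W @ H, via Frobenius multiplicative updates."""
    m, n = X.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Nonnegative data matrix (columns are data points).
X = rng.random((30, 50))

# Deep MF, two layers: factorize X, then factorize the coefficients H1,
# yielding a hierarchy X ~= W1 @ W2 @ H2 with ranks 8 then 4.
W1, H1 = nmf(X, 8)
W2, H2 = nmf(H1, 4)

approx = W1 @ W2 @ H2
err = np.linalg.norm(X - approx) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

In this sequential scheme the inner factors W1 W2 play the role of a hierarchy of features, with deeper layers capturing coarser structure; the surveyed methods differ mainly in how the layers are coupled and refined jointly.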