Journal
NEURAL COMPUTATION
Volume 31, Issue 2, Pages 417-439
Publisher
MIT Press
DOI: 10.1162/neco_a_01157
Funding
- European Research Council (ERC) [679515]
- Fonds de la Recherche Scientifique
- Fonds Wetenschappelijk Onderzoek-Vlaanderen under EOS Project [O005318F-RG47]
We propose a general framework to significantly accelerate algorithms for nonnegative matrix factorization (NMF). The framework is inspired by the extrapolation scheme used to accelerate gradient methods in convex optimization and by the method of parallel tangents. However, the use of extrapolation in the context of exact coordinate descent algorithms for the nonconvex NMF problem is novel. We illustrate the performance of this approach on two state-of-the-art NMF algorithms, accelerated hierarchical alternating least squares and alternating nonnegative least squares, using synthetic, image, and document data sets.
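The extrapolation idea described in the abstract can be sketched as follows: after each alternating update of the factors W and H, take a step along the direction of the last change, X + beta*(X - X_prev), and project back onto the nonnegative orthant before the next update. The sketch below is a minimal illustration of that scheme wrapped around simple multiplicative updates; it is not the paper's HALS/ANLS inner solvers, and the function name, fixed extrapolation weight `beta`, and stopping rule are assumptions for illustration only.

```python
import numpy as np

def nmf_extrapolated(M, r, n_iter=100, beta=0.5, seed=0):
    """Illustrative sketch: alternating nonnegative updates for M ~= W @ H,
    with an extrapolation step X + beta*(X - X_prev) projected onto the
    nonnegative orthant (hypothetical implementation, not the paper's code)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    W_prev, H_prev = W.copy(), H.copy()
    for _ in range(n_iter):
        # Extrapolated points, projected to stay nonnegative.
        W_e = np.maximum(W + beta * (W - W_prev), 0.0)
        H_e = np.maximum(H + beta * (H - H_prev), 0.0)
        W_prev, H_prev = W.copy(), H.copy()
        # Multiplicative updates computed from the extrapolated points
        # (stand-in for the exact coordinate descent solvers in the paper).
        H = H_e * (W_e.T @ M) / (W_e.T @ W_e @ H_e + 1e-12)
        W = W_e * (M @ H.T) / (W_e @ H @ H.T + 1e-12)
    return W, H
```

In practice, the weight `beta` is usually adapted over the iterations (increased while the error decreases, reset otherwise) rather than held fixed as in this sketch.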