Article

Tensor Decomposition for Signal Processing and Machine Learning

Journal

IEEE TRANSACTIONS ON SIGNAL PROCESSING
Volume 65, Issue 13, Pages 3551-3582

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TSP.2017.2690524

Keywords

Tensor decomposition; tensor factorization; rank; canonical polyadic decomposition (CPD); parallel factor analysis (PARAFAC); Tucker model; higher-order singular value decomposition (HOSVD); multilinear singular value decomposition (MLSVD); uniqueness; NP-hard problems; alternating optimization; alternating direction method of multipliers; gradient descent; Gauss-Newton; stochastic gradient; Cramér-Rao bound; communications; source separation; harmonic retrieval; speech separation; collaborative filtering; mixture modeling; topic modeling; classification; subspace learning

Funding

  1. National Science Foundation [IIS-1247632, IIS-1447788, IIS-1247489]
  2. KU Leuven Research Council [CoE EF/05/006, C16/15/059-nD]
  3. F.W.O. [G.0830.14N, G.0881.14N]
  4. Belgian Federal Science Policy Office [2012-2017]
  5. EU through the European Research Council [339804]
  6. NSF Directorate for Computer & Information Science & Engineering
  7. NSF Division of Information & Intelligent Systems [1247632, 1447788]

Abstract

Tensors or multiway arrays are functions of three or more indices (i, j, k, ...), similar to matrices (two-way arrays), which are functions of two indices (r, c) for (row, column). Tensors have a rich history, stretching over almost a century, and touching upon numerous disciplines; but they have only recently become ubiquitous in signal and data analytics at the confluence of signal processing, statistics, data mining, and machine learning. This overview article aims to provide a good starting point for researchers and practitioners interested in learning about and working with tensors. As such, it focuses on fundamentals and motivation (using various application examples), aiming to strike an appropriate balance of breadth and depth that will enable someone having taken first graduate courses in matrix algebra and probability to get started doing research and/or developing tensor algorithms and software. Some background in applied optimization is useful but not strictly required. The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties (including fairly good coverage of identifiability); broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
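The canonical polyadic decomposition (CPD) named in the abstract writes a third-order tensor as a sum of R rank-one terms, X[i, j, k] ≈ Σ_r A[i, r] B[j, r] C[k, r], and alternating optimization fits one factor matrix at a time while holding the others fixed. As a taste of that material, here is a minimal NumPy sketch of CPD fitted by alternating least squares; the function names (khatri_rao, cpd_als) and all defaults are illustrative choices of this summary, not code from the paper.

```python
import numpy as np

def khatri_rao(A, B):
    """Khatri-Rao (column-wise Kronecker) product: (I x R), (J x R) -> (I*J x R)."""
    I, R = A.shape
    J, _ = B.shape
    # Row i*J + j of the result is A[i, :] * B[j, :].
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cpd_als(X, R, n_iter=200, seed=0):
    """Fit X[i, j, k] ~ sum_r A[i, r] B[j, r] C[k, r] by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    # Mode-n unfoldings, consistent with NumPy's C-order reshapes:
    X1 = X.reshape(I, J * K)                     # X1 = A @ khatri_rao(B, C).T
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # X2 = B @ khatri_rao(A, C).T
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # X3 = C @ khatri_rao(A, B).T
    for _ in range(n_iter):
        # Each step is a linear least-squares problem in one factor.
        A = X1 @ np.linalg.pinv(khatri_rao(B, C)).T
        B = X2 @ np.linalg.pinv(khatri_rao(A, C)).T
        C = X3 @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

# Sanity check on a synthetic, exactly rank-3 tensor: the relative fit error
# should approach zero (factors are recovered up to permutation and scaling).
rng = np.random.default_rng(1)
A0 = rng.standard_normal((5, 3))
B0 = rng.standard_normal((6, 3))
C0 = rng.standard_normal((7, 3))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cpd_als(X, R=3)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - Xhat) / np.linalg.norm(X))
```

The pseudoinverse-based update is the plainest way to show the ALS structure; practical implementations the paper discusses add normalization, convergence checks, and faster solves of the normal equations.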

Authors

Nicholas D. Sidiropoulos, Lieven De Lathauwer, Xiao Fu, Kejun Huang, Evangelos E. Papalexakis, and Christos Faloutsos
