Article

THE ALTERNATING LINEAR SCHEME FOR TENSOR OPTIMIZATION IN THE TENSOR TRAIN FORMAT

Journal

SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 34, Issue 2, Pages A683-A713

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/100818893

Keywords

density matrix renormalization group; alternating least squares; optimization problem; iterative methods for linear systems; eigenvalue problem; tensor product approximation; tensor decompositions; tensor train; high-dimensional systems; matrix product states; hierarchical tensors

Funding

  1. DFG [SPP 1324]

Abstract

Recent achievements in the field of tensor product approximation provide promising new formats for the representation of tensors in the form of tree tensor networks. In contrast to the canonical r-term representation (CANDECOMP, PARAFAC), these new formats provide stable representations, while the amount of required data is only slightly larger. The tensor train (TT) format [SIAM J. Sci. Comput., 33 (2011), pp. 2295-2317], a simple special case of the hierarchical Tucker format [J. Fourier Anal. Appl., 15 (2009), pp. 706-722], is a useful prototype for practical low-rank tensor representation. In this article, we show how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation. A formulation of the component equations in terms of so-called retraction operators helps to show that many structural properties of the original problems transfer to the micro-iterations, giving what is, to our knowledge, the first stable generic algorithm for the treatment of optimization tasks in the tensor format. For the examples of linear equations and eigenvalue equations, we derive concrete working equations for the micro-iteration steps; numerical examples confirm the theoretical results concerning the stability of the TT decomposition and of ALS and MALS, but also show that in some cases high TT ranks are required during the iterative approximation of low-rank tensors, indicating some potential for improvement.
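As a concrete illustration of the alternating scheme the abstract describes, below is a minimal NumPy sketch of one-site ALS for the simplest such optimization task: fitting TT cores of fixed rank to a given full tensor. The function name tt_als_approx is hypothetical, and the use of explicit pseudo-inverses for the interface matrices is a simplification of the orthogonalization-based micro-steps the paper analyzes; read it as a sketch of the structure, not the authors' implementation.

```python
import numpy as np

def tt_als_approx(X, ranks, sweeps=10, seed=0):
    """Fit TT cores of fixed rank to a full tensor X by one-site ALS."""
    rng = np.random.default_rng(seed)
    dims = X.shape
    d = len(dims)
    r = [1] + list(ranks) + [1]                 # TT ranks r_0, ..., r_d
    cores = [rng.standard_normal((r[k], dims[k], r[k + 1])) for k in range(d)]

    def interfaces(k):
        # Left interface: cores 0..k-1 contracted, shape (n_0*...*n_{k-1}, r_k).
        L = np.ones((1, 1))
        for i in range(k):
            L = np.tensordot(L, cores[i], axes=(1, 0))
            L = L.reshape(-1, cores[i].shape[2])
        # Right interface: cores k+1..d-1 contracted, shape (r_{k+1}, n_{k+1}*...*n_{d-1}).
        R = np.ones((1, 1))
        for i in range(d - 1, k, -1):
            R = np.tensordot(cores[i], R, axes=(2, 0))
            R = R.reshape(cores[i].shape[0], -1)
        return L, R

    for _ in range(sweeps):
        for k in range(d):                      # one forward sweep
            L, R = interfaces(k)
            X3 = X.reshape(L.shape[0], dims[k], R.shape[1])
            # Micro-iteration: min_G || X3 - L G R ||_F is linear in the
            # k-th core; solved here via pseudo-inverses of the interfaces.
            cores[k] = np.einsum('al,lnm,mb->anb',
                                 np.linalg.pinv(L), X3, np.linalg.pinv(R))
    return cores

if __name__ == "__main__":
    # Demo: recover a 4-way tensor of exact TT rank (2, 2, 2).
    rng = np.random.default_rng(1)
    gs = [rng.standard_normal(s)
          for s in [(1, 5, 2), (2, 5, 2), (2, 5, 2), (2, 5, 1)]]
    X = np.einsum('aib,bjc,ckd,dle->ijkl', *gs)
    cores = tt_als_approx(X, ranks=(2, 2, 2), sweeps=20)
    T = np.einsum('aib,bjc,ckd,dle->ijkl', *cores)
    print('relative error:', np.linalg.norm(X - T) / np.linalg.norm(X))
```

Each micro-iteration solves a small linear least-squares problem in one core while all other cores are frozen; the same alternating structure carries over when the objective comes from a linear system or an eigenvalue problem, where the micro-step becomes a reduced linear solve or eigenvalue problem.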

Authors

Sebastian Holtz, Thorsten Rohwedder, Reinhold Schneider
