Journal: PHYSICAL REVIEW A
Volume 106, Issue 6
Publisher: AMER PHYSICAL SOC
DOI: 10.1103/PhysRevA.106.062423
This paper compares classical tensor networks (TNs) with TN-inspired quantum circuits in the context of machine learning. The study shows that classical TNs require larger bond dimensions and produce a flat loss landscape, making gradient-based optimization challenging. Using quantitative metrics, the paper also demonstrates that classical TNs require more training samples than TN-inspired quantum circuits to represent the data equally well. Additionally, the study explores hybrid classical-quantum TNs and presents several TN Ansätze.
Tensor networks (TN) are approximations of high-dimensional tensors designed to represent locally entangled quantum many-body systems efficiently. This paper provides a comprehensive comparison between classical TNs and TN-inspired quantum circuits in the context of machine learning on highly complex, simulated Large Hadron Collider data. We show that classical TNs require exponentially large bond dimensions and higher Hilbert-space mapping to perform comparably to their quantum counterparts. While such an expansion in the dimensionality allows better performance, we observe that, with increased dimensionality, classical TNs lead to a highly flat loss landscape, rendering the usage of gradient-based optimization methods highly challenging. Furthermore, by employing quantitative metrics, such as the Fisher information and effective dimensions, we show that classical TNs require a more extensive training sample to represent the data as efficiently as TN-inspired quantum circuits. We also engage with the idea of hybrid classical-quantum TNs and show possible architectures to employ a larger phase space from the data. We offer our results using three main TN Ansätze: tree tensor networks, matrix product states, and multiscale entanglement renormalization Ansätze.
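To make the bond-dimension notion concrete, the following is a minimal illustrative sketch (not the paper's implementation) of how a matrix product state, one of the three Ansätze listed above, is contracted with a product-state feature map in TN-based machine learning. The feature map, tensor shapes, and random initialization here are generic textbook choices, not taken from the paper; the parameter `D` is the bond dimension whose growth the abstract discusses.

```python
import numpy as np

def feature_map(x):
    # Map a scalar feature x in [0, 1] to a 2-dimensional local Hilbert space.
    # This cos/sin embedding is a common generic choice, assumed here for illustration.
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def mps_overlap(tensors, features):
    # Contract an MPS with a product state of local feature vectors.
    # tensors[i] has shape (D_left, d, D_right); the boundary bonds have size 1.
    env = np.ones((1,))                    # left environment, starts as trivial bond
    for A, x in zip(tensors, features):
        v = feature_map(x)                 # local feature vector, shape (d,)
        M = np.einsum('ldr,d->lr', A, v)   # contract the physical index of the site tensor
        env = env @ M                      # absorb the site matrix into the environment
    return env.item()                      # scalar overlap <MPS|features>

# Random MPS over n sites with physical dimension d and bond dimension D.
rng = np.random.default_rng(0)
D, d, n = 4, 2, 6
shapes = [(1, d, D)] + [(D, d, D)] * (n - 2) + [(D, d, 1)]
tensors = [rng.normal(size=s) for s in shapes]
score = mps_overlap(tensors, rng.random(n))
```

The cost of this contraction scales polynomially in `D`, which is why the exponential growth in bond dimension reported for classical TNs translates directly into a computational burden that the quantum-circuit counterparts avoid.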