Article

Generalization in quantum machine learning from few training data

Journal

NATURE COMMUNICATIONS
Volume 13, Issue 1

Publisher

NATURE PORTFOLIO
DOI: 10.1038/s41467-022-32550-3

Funding

  1. TopMath Graduate Center of the TUM Graduate School at the Technical University of Munich, Germany
  2. TopMath Program at the Elite Network of Bavaria
  3. German Academic Scholarship Foundation (Studienstiftung des deutschen Volkes)
  4. J. Yang & Family Foundation
  5. LANL LDRD program [20200022DR]
  6. Center for Nonlinear Studies at LANL
  7. Department of Defense
  8. LANL ASC Beyond Moore's Law project
  9. U.S. Department of Energy (DOE), Office of Science, Office of Advanced Scientific Computing Research
  10. U.S. Department of Energy, Office of Science, National Quantum Information Science Research Centers, Quantum Science Center
  11. U.S. Department of Energy National Nuclear Security Administration [89233218CNA000001]

This study provides a comprehensive investigation into the generalization performance of quantum machine learning (QML) models trained on limited data. The results establish the relationship between the generalization error of a QML model and the number of trainable gates, and demonstrate potential applications including quantum convolutional neural networks, learning quantum error-correcting codes, and quantum dynamical simulation.
Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set, and subsequently making predictions on a testing data set (i.e., generalizing). In this work, we provide a comprehensive study of generalization performance in QML after training on a limited number N of training data points. We show that the generalization error of a quantum machine learning model with T trainable gates scales at worst as sqrt(T/N). When only K << T gates have undergone substantial change in the optimization process, we prove that the generalization error improves to sqrt(K/N). Our results imply that the compiling of unitaries into a polynomial number of native gates, a crucial application for the quantum computing industry that typically uses exponential-size training data, can be sped up significantly. We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set. Other potential applications include learning quantum error-correcting codes or quantum dynamical simulation. Our work injects new hope into the field of QML, as good generalization is guaranteed from few training data.
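The abstract's scaling can be made concrete with a small numerical sketch. The snippet below is purely illustrative (the helper `gen_error_bound` and the example values T = 100, K = 4, N = 10,000 are assumptions for demonstration, not code or data from the paper); it shows how the worst-case bound sqrt(T/N) tightens to sqrt(K/N) when only a few gates change substantially during training:

```python
import math

def gen_error_bound(num_gates: int, n_train: int) -> float:
    """Generalization-error scaling sqrt(gates / N) from the abstract.

    Hypothetical helper: evaluates the bound's scaling for given
    gate and training-set counts (constants are suppressed).
    """
    return math.sqrt(num_gates / n_train)

# Worst case: all T = 100 trainable gates contribute.
# sqrt(100 / 10000) = 0.1
full_bound = gen_error_bound(100, 10_000)

# If only K = 4 << T gates change substantially in optimization,
# the bound improves to sqrt(4 / 10000) = 0.02.
effective_bound = gen_error_bound(4, 10_000)

print(full_bound, effective_bound)
```

Equivalently, reaching a target error epsilon requires N on the order of T/epsilon^2 training points, i.e., only linear growth in the number of trainable gates.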
