Journal: PHYSICAL REVIEW LETTERS
Volume 130, Issue 15
Publisher: AMER PHYSICAL SOC
DOI: 10.1103/PhysRevLett.130.150601
Keywords: (none listed)
Abstract
Parametrized quantum circuits can be used as quantum neural networks and have the potential to outperform their classical counterparts when trained to address learning problems. To date, most results on their performance on practical problems are heuristic in nature. In particular, the convergence rate for the training of quantum neural networks is not fully understood. Here, we analyze the dynamics of gradient descent for the training error of a class of variational quantum machine learning models. We define wide quantum neural networks as parametrized quantum circuits in the limit of a large number of qubits and variational parameters. We then find a simple analytic formula that captures the average behavior of their loss function and discuss the consequences of our findings. For example, for random quantum circuits, we predict and characterize an exponential decay of the residual training error as a function of the parameters of the system. Finally, we validate our analytic results with numerical experiments.
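The exponential decay of the residual training error predicted by the abstract is the typical signature of gradient descent in a linearized ("lazy") training regime, where each eigenmode of the loss contracts at a rate set by an eigenvalue of the effective kernel. The following sketch is purely illustrative and is not the paper's quantum model: it uses a classical overparametrized least-squares problem (all sizes, the learning rate, and the data are arbitrary choices) to show the same qualitative behavior.

```python
import numpy as np

# Illustrative classical analogue (not the paper's quantum circuit model):
# gradient descent on a quadratic loss L(theta) = 0.5 * ||A @ theta - b||^2.
# Each residual eigenmode shrinks by a factor (1 - eta * lambda_i) per step,
# so the training error decays exponentially in the number of steps.
rng = np.random.default_rng(0)

n_params, n_samples = 50, 20            # overparametrized: more params than data
A = rng.normal(size=(n_samples, n_params)) / np.sqrt(n_params)
b = rng.normal(size=n_samples)

theta = np.zeros(n_params)
eta = 0.5                               # learning rate, below 2 / lambda_max
losses = []
for _ in range(200):
    resid = A @ theta - b
    losses.append(0.5 * float(resid @ resid))
    theta -= eta * (A.T @ resid)        # exact gradient of the quadratic loss

print(f"initial loss {losses[0]:.3e}, final loss {losses[-1]:.3e}")
```

The decay rate of the slowest mode is governed by the smallest nonzero kernel eigenvalue, which is the classical counterpart of the system-parameter dependence the paper characterizes for random quantum circuits.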