Article

Large gradients via correlation in random parameterized quantum circuits

Journal

QUANTUM SCIENCE AND TECHNOLOGY
Volume 6, Issue 2, Pages -

Publisher

IOP Publishing Ltd
DOI: 10.1088/2058-9565/abd891

Keywords

variational quantum algorithms; quantum machine learning; quantum alternating operator ansatz; Grover's algorithm

Funding

  1. LANL's Laboratory Directed Research and Development (LDRD) program
  2. LANL ASC Beyond Moore's Law project
  3. U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under the Accelerated Research in Quantum Computing (ARQC) program

Abstract

This study demonstrates that reducing the dimensionality of the parameter space by using circuit modules containing spatially or temporally correlated gate layers can help avoid the vanishing-gradient phenomenon. For variational versions of Grover's algorithm, the bounds on cost-function variation imply a transition from vanishing gradients to efficient trainability as the number of layers increases toward O(2^{n/2}).
Scaling of variational quantum algorithms to large problem sizes requires efficient optimization of random parameterized quantum circuits. For such circuits with uncorrelated parameters, the presence of exponentially vanishing gradients in cost function landscapes is an obstacle to optimization by gradient descent methods. In this work, we prove that reducing the dimensionality of the parameter space by utilizing circuit modules containing spatially or temporally correlated gate layers can allow one to circumvent the vanishing gradient phenomenon. Examples are drawn from random separable circuits and asymptotically optimal variational versions of Grover's algorithm based on the quantum alternating operator ansatz. In the latter scenario, our bounds on cost function variation imply a transition between vanishing gradients and efficient trainability as the number of layers is increased toward O(2^{n/2}).
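
The parameter-correlation idea lends itself to a small numerical illustration. The sketch below is not the authors' code: the RY/CZ layered circuit, the single-qubit Z cost function, and all sizes are illustrative assumptions. It estimates the variance of a cost-function partial derivative over random initializations, comparing a circuit with independent per-layer parameters against a temporally correlated one in which a single parameter vector is reused across every layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cz_ladder_diag(n):
    """Diagonal of CZ gates acting on each pair of neighbouring qubits."""
    diag = np.ones(2 ** n, dtype=complex)
    for b in range(2 ** n):
        bits = [(b >> k) & 1 for k in range(n)]
        for q in range(n - 1):
            if bits[q] and bits[q + 1]:
                diag[b] *= -1
    return diag

def cost(theta_layers, n):
    """C = <psi|Z_0|psi> for psi produced by alternating RY and CZ layers."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    cz = cz_ladder_diag(n)
    for thetas in theta_layers:
        psi = kron_all([ry(t) for t in thetas]) @ psi  # RY rotation layer
        psi = cz * psi                                 # entangling CZ ladder
    # Z on qubit 0 (most significant bit in this tensor-product ordering)
    z0 = np.where(((np.arange(2 ** n) >> (n - 1)) & 1) == 0, 1.0, -1.0)
    return float(np.real(np.vdot(psi, z0 * psi)))

def grad_sample(n, L, correlated, eps=1e-4):
    """Central-difference estimate of dC/dtheta_0 at a random parameter point."""
    size = n if correlated else n * L
    flat = rng.uniform(0.0, 2 * np.pi, size)

    def c(f):
        # Correlated: one parameter vector reused in all L layers,
        # so shifting theta_0 shifts it in every layer simultaneously.
        layers = [f] * L if correlated else f.reshape(L, n)
        return cost(layers, n)

    shift = np.zeros(size)
    shift[0] = eps
    return (c(flat + shift) - c(flat - shift)) / (2 * eps)

n_qubits, n_layers, n_samples = 4, 8, 200
for correlated in (False, True):
    grads = [grad_sample(n_qubits, n_layers, correlated) for _ in range(n_samples)]
    label = "correlated  " if correlated else "uncorrelated"
    print(f"{label} layers: Var[dC/dtheta_0] ~ {np.var(grads):.3e}")
```

The printed variances give only a rough, simulation-based counterpart to the analytic variance bounds discussed in the abstract: at toy sizes the gap is modest, and the exponential separation the paper proves emerges only as the qubit count and depth grow.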

