Article

Virtual Distillation for Quantum Error Mitigation

Journal

PHYSICAL REVIEW X
Volume 11, Issue 4

Publisher

AMER PHYSICAL SOC
DOI: 10.1103/PhysRevX.11.041036


Funding

  1. NSF QLCI program [OMA2016245]


Abstract
Contemporary quantum computers have relatively high levels of noise, making it difficult to use them to perform useful calculations, even with a large number of qubits. Quantum error correction is expected to eventually enable fault-tolerant quantum computation at large scales, but until then, it will be necessary to use alternative strategies to mitigate the impact of errors. We propose a near-term friendly strategy to mitigate errors by entangling and measuring M copies of a noisy state ρ. This enables us to estimate expectation values with respect to a state with dramatically reduced error, ρ^M / Tr(ρ^M), without explicitly preparing it, hence the name virtual distillation. As M increases, this state approaches the closest pure state to ρ exponentially quickly. We analyze the effectiveness of virtual distillation and find that it is governed in many regimes by the behavior of this pure state (corresponding to the dominant eigenvector of ρ). We numerically demonstrate that virtual distillation is capable of suppressing errors by multiple orders of magnitude and explain how this effect is enhanced as the system size grows. Finally, we show that this technique can improve the convergence of randomized quantum algorithms, even in the absence of device noise.
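The corrected quantity the abstract describes can be illustrated numerically. The following is a minimal sketch, assuming a toy single-qubit depolarizing noise model; it evaluates Tr(O ρ^M)/Tr(ρ^M) directly from the density matrix rather than via the copy-and-measure circuit the paper actually proposes, so it shows only the mathematics of the error suppression, not the measurement scheme.

```python
import numpy as np

def distilled_expectation(rho, O, M=2):
    """Return Tr(O rho^M) / Tr(rho^M), the virtually distilled expectation value."""
    rho_M = np.linalg.matrix_power(rho, M)
    return np.real(np.trace(O @ rho_M) / np.trace(rho_M))

# Toy example (assumed noise model): |0> depolarized with probability p.
p = 0.2
ideal = np.array([[1, 0], [0, 0]], dtype=complex)   # |0><0|
rho = (1 - p) * ideal + p * np.eye(2) / 2           # noisy mixed state
Z = np.diag([1.0, -1.0]).astype(complex)            # observable to estimate

raw = np.real(np.trace(Z @ rho))                    # noisy estimate: 0.8
distilled = distilled_expectation(rho, Z, M=2)      # 0.8/0.82 ~ 0.976

# The distilled value lies much closer to the ideal <Z> = 1 than the raw one,
# and increasing M pushes it toward the dominant eigenvector's expectation.
```

With M = 2 the residual bias drops from 0.2 to about 0.024 in this example; the quadratic suppression mirrors the exponential approach to the dominant eigenvector described in the abstract.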

