4.6 Article

Neural tensor contractions and the expressive power of deep neural quantum states

Journal

PHYSICAL REVIEW B
Volume 106, Issue 20, Pages: -

Publisher

AMER PHYSICAL SOC
DOI: 10.1103/PhysRevB.106.205136

Keywords

-

Abstract

We establish a direct connection between general tensor networks and deep feed-forward artificial neural networks. The core of our results is the construction of neural-network layers that efficiently perform tensor contractions and that use commonly adopted nonlinear activation functions. The resulting deep networks feature a number of edges that closely matches the contraction complexity of the tensor networks to be approximated. In the context of many-body quantum states, this result establishes that neural-network states have strictly the same or higher expressive power than practically usable variational tensor networks. As an example, we show that all matrix product states can be efficiently written as neural-network states with a number of edges polynomial in the bond dimension and a depth that is logarithmic in the system size. The opposite, instead, does not hold true, and our results imply that there exist quantum states that are not efficiently expressible in terms of matrix product states or projected entangled pair states but that are instead efficiently expressible with neural-network states.
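As a minimal illustration of the contraction-ordering idea behind the logarithmic-depth claim, the NumPy sketch below contracts a random matrix product state both sequentially and as a balanced binary tree of pairwise matrix products. The chain length, bond dimension, and function names are illustrative assumptions, not the paper's actual construction, which further replaces each pairwise contraction with neural-network layers using standard nonlinear activations.

```python
import numpy as np

# Illustrative parameters (not from the paper): chain length N, bond dimension D,
# local physical dimension d = 2 (spin-1/2).
N, D, d = 8, 4, 2
rng = np.random.default_rng(0)

# A random (unnormalized) MPS: one d x D x D tensor per site,
# with open boundaries handled by two D-dimensional boundary vectors.
mps = [rng.normal(size=(d, D, D)) for _ in range(N)]
left = rng.normal(size=(D,))
right = rng.normal(size=(D,))

def amplitude_sequential(spins):
    """Standard left-to-right contraction: N matrix-vector products (about N sequential stages)."""
    v = left
    for site, s in enumerate(spins):
        v = v @ mps[site][s]  # contract the shared bond index
    return v @ right

def amplitude_tree(spins):
    """Same contraction arranged as a balanced tree of pairwise matrix products.
    Only O(log N) sequential stages are needed, mirroring the logarithmic depth
    discussed in the abstract."""
    mats = [mps[site][s] for site, s in enumerate(spins)]
    while len(mats) > 1:
        # contract neighbouring pairs; carry the last matrix forward if the count is odd
        mats = [mats[i] @ mats[i + 1] if i + 1 < len(mats) else mats[i]
                for i in range(0, len(mats), 2)]
    return left @ mats[0] @ right

spins = rng.integers(0, d, size=N)
print(np.isclose(amplitude_sequential(spins), amplitude_tree(spins)))  # True
```

The point of the tree ordering is only structural: the number of sequential contraction stages grows as O(log N) while the total work remains polynomial in the bond dimension, which is the property the neural-network mapping exploits.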

