We discuss an alternative to relative entropy as a measure of distance between mixed quantum states. The proposed quantity extends the Jensen-Shannon divergence (JSD) between probability distributions to the realm of quantum theory. The JSD has several interesting properties. It arises in information theory and, unlike the Kullback-Leibler divergence, it is symmetric, always well-defined, and bounded. We show that the quantum JSD shares with the relative entropy most of the physically relevant properties, in particular those required for a good quantum distinguishability measure. We relate it to other known quantum distances and suggest possible applications in the field of quantum information theory.
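The contrast drawn above between the Kullback-Leibler divergence and the JSD can be illustrated with a minimal sketch of the classical case. This is not the paper's quantum construction, only the underlying classical JSD it extends; base-2 logarithms are assumed, so the JSD is bounded by 1 bit.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits.

    Diverges (is ill-defined) when q_i = 0 for some i with p_i > 0,
    which is why the KL divergence is not always well-defined.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric in (p, q), always defined,
    and bounded by 1 bit. Defined via the mixture m = (p + q) / 2,
    which is strictly positive wherever p or q is, so the inner
    KL terms never diverge."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Orthogonal distributions: the KL divergence blows up,
# but the JSD attains its maximum of 1 bit.
p = [1.0, 0.0]
q = [0.0, 1.0]
print(jsd(p, q))  # 1.0
print(jsd(p, p))  # 0.0
```

The quantum version studied in the paper replaces probability distributions with density matrices and the Shannon entropy with the von Neumann entropy; the boundedness and symmetry seen here are the classical properties it inherits.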