Article; Proceedings Paper

The Kullback-Leibler divergence rate between Markov sources

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 50, Issue 5, Pages 917-921

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2004.826687

Keywords

classification; decision theory; Kullback-Leibler divergence rate; nonnegative matrices; pattern recognition; Perron-Frobenius theory; rate of convergence; Shannon entropy rate; time-invariant Markov sources

Abstract

In this work, we provide a computable expression for the Kullback-Leibler divergence rate $\lim_{n\to\infty} \frac{1}{n} D(p^{(n)} \| q^{(n)})$ between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions $p^{(n)}$ and $q^{(n)}$, respectively. We illustrate it numerically and examine its rate of convergence. The main tools used to obtain the Kullback-Leibler divergence rate and its rate of convergence are the theory of nonnegative matrices and Perron-Frobenius theory. Similarly, we provide a formula for the Shannon entropy rate $\lim_{n\to\infty} \frac{1}{n} H(p^{(n)})$ of Markov sources and examine its rate of convergence.
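
The abstract's quantities admit well-known closed forms in the special case of first-order, irreducible, aperiodic chains: the divergence rate equals the stationary expectation of the row-wise divergences, $\sum_i \pi_i D(P_{i\cdot} \| Q_{i\cdot})$, and the entropy rate is $-\sum_i \pi_i \sum_j P_{ij} \log P_{ij}$. The sketch below, in Python, illustrates these formulas and a finite-$n$ convergence check via the chain rule for Kullback-Leibler divergence. It is a minimal illustration under those restrictive assumptions, not the paper's general result (which handles arbitrary order, arbitrary initial distributions, and reducible chains via Perron-Frobenius theory); all function names and the toy matrices are hypothetical.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of an irreducible, aperiodic chain:
    left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def row_kl(P, Q):
    """Per-state divergence D(P_i. || Q_i.) between transition rows,
    with 0 log 0 = 0 (assumes Q_ij > 0 wherever P_ij > 0)."""
    ratio = np.where(P > 0, P / np.where(Q > 0, Q, 1.0), 1.0)
    return np.sum(np.where(P > 0, P * np.log(ratio), 0.0), axis=1)

def kl_divergence_rate(P, Q):
    """lim_n (1/n) D(p^(n) || q^(n)) = sum_i pi_i D(P_i. || Q_i.)."""
    return float(stationary_distribution(P) @ row_kl(P, Q))

def entropy_rate(P):
    """lim_n (1/n) H(p^(n)) = -sum_i pi_i sum_j P_ij log P_ij."""
    h = np.sum(np.where(P > 0, -P * np.log(np.where(P > 0, P, 1.0)), 0.0), axis=1)
    return float(stationary_distribution(P) @ h)

def finite_n_divergence(mu, nu, P, Q, n):
    """D(p^(n) || q^(n)) by the chain rule: the divergence of the initial
    distributions plus the expected row divergence at each of n-1 steps,
    where the state distribution at step t is mu P^(t-1)."""
    ratio0 = np.where(mu > 0, mu / np.where(nu > 0, nu, 1.0), 1.0)
    d = float(np.sum(np.where(mu > 0, mu * np.log(ratio0), 0.0)))
    rk, dist = row_kl(P, Q), mu.copy()
    for _ in range(n - 1):
        d += float(dist @ rk)
        dist = dist @ P
    return d

# Toy two-state sources (matrices chosen arbitrarily for illustration):
# (1/n) D(p^(n) || q^(n)) should approach the divergence rate as n grows.
P = np.array([[0.9, 0.1], [0.2, 0.8]])
Q = np.array([[0.5, 0.5], [0.4, 0.6]])
mu = np.array([1.0, 0.0])
nu = np.array([0.5, 0.5])
for n in (10, 100, 1000):
    print(n, finite_n_divergence(mu, nu, P, Q, n) / n)
print("limit:", kl_divergence_rate(P, Q), "entropy rate:", entropy_rate(P))
```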
