Article

Rényi Divergence and Kullback-Leibler Divergence

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 60, Issue 7, Pages 3797-3820 (2014)

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TIT.2014.2320500

Keywords

alpha-divergence; Bhattacharyya distance; information divergence; Kullback-Leibler divergence; Pythagorean inequality; Rényi divergence

Funding

  1. Netherlands Organization for Scientific Research through Rubicon Programme [680-50-1112]

Abstract

Rényi divergence is related to Rényi entropy much as Kullback-Leibler divergence is related to Shannon entropy, and it arises in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and it depends on a parameter called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of sigma-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders), and present several other minimax results.
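For discrete distributions, the standard definition behind the abstract is D_alpha(P||Q) = (1/(alpha-1)) log sum_i p_i^alpha q_i^(1-alpha), which recovers the Kullback-Leibler divergence in the limit alpha -> 1. The sketch below (an illustrative helper, not code from the paper, assuming both distributions have full support on the same finite set) checks that limit numerically:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P || Q) between discrete distributions.

    Standard definition: D_alpha = log(sum_i p_i**alpha * q_i**(1 - alpha)) / (alpha - 1),
    with the order-1 case taken to be the Kullback-Leibler divergence.
    Assumes p and q are probability vectors with q_i > 0 wherever p_i > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        mask = p > 0  # convention: 0 * log 0 = 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
for alpha in (0.5, 0.9, 0.99, 1.0, 2.0):
    print(f"alpha = {alpha:4}: {renyi_divergence(p, q, alpha):.6f}")
```

The value at alpha = 0.99 is already close to the Kullback-Leibler divergence at alpha = 1, consistent with the order-1 case stated in the abstract; orders other than 1 give the family whose Pythagorean inequality and minimax properties the paper studies.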

Authors

Tim van Erven; Peter Harremoës
