Journal
ENTROPY
Volume 21, Issue 5
Publisher
MDPI
DOI: 10.3390/e21050485
Keywords
Jensen-Shannon divergence; Jeffreys divergence; resistor average distance; Bhattacharyya distance; f-divergence; Jensen; Burbea-Rao divergence; Bregman divergence; abstract weighted mean; quasi-arithmetic mean; mixture family; statistical M-mixture; exponential family; Gaussian family; Cauchy scale family; clustering
Abstract
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence: it measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means, which yields closed-form expressions when the mean is chosen to match the parametric family of distributions. More generally, we define the JS-symmetrization of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well suited to exponential families, and report two closed-form formulas: (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well suited to the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with these novel Jensen-Shannon divergences are touched upon.
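To illustrate claim (i) concretely for univariate Gaussians: the normalized weighted geometric mean of two Gaussian densities is again Gaussian (precisions interpolate linearly in the natural parameters), so the geometric JS divergence reduces to two closed-form Gaussian Kullback-Leibler terms. The following is a minimal sketch using only this definitional route; the function names (`kl_gauss`, `geo_mix`, `geo_jsd`) and the default skew α = 1/2 are our illustrative choices, not notation from the paper.

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    # Closed-form KL(N(mu1, s1^2) : N(mu2, s2^2)) for univariate Gaussians.
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def geo_mix(mu1, s1, mu2, s2, alpha=0.5):
    # Normalized weighted geometric mean p^(1-alpha) q^alpha of two Gaussian
    # densities: again Gaussian, with linearly interpolated precision.
    p1, p2 = 1 / s1**2, 1 / s2**2
    prec = (1 - alpha) * p1 + alpha * p2
    mu = ((1 - alpha) * p1 * mu1 + alpha * p2 * mu2) / prec
    return mu, 1 / math.sqrt(prec)

def geo_jsd(mu1, s1, mu2, s2, alpha=0.5):
    # Skewed geometric JS divergence: total KL to the geometric mixture.
    mu, s = geo_mix(mu1, s1, mu2, s2, alpha)
    return ((1 - alpha) * kl_gauss(mu1, s1, mu, s)
            + alpha * kl_gauss(mu2, s2, mu, s))
```

At α = 1/2 the geometric mixture is symmetric in its two arguments, so `geo_jsd` is a symmetric divergence, vanishing only when the two Gaussians coincide.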