Article

Jensen-information generating function and its connections to some well-known information measures

Journal

STATISTICS & PROBABILITY LETTERS
Volume 170

Publisher

ELSEVIER
DOI: 10.1016/j.spl.2020.108995

Keywords

Information generating function; Shannon entropy; Jensen-Shannon entropy; Jensen-extropy; Kullback-Leibler divergence


In this work, we consider the information generating function measure and develop some new results associated with it. We specifically propose two new divergence measures and show that some of the well-known information divergences, such as the Jensen-Shannon, Jensen-extropy, and Jensen-Taneja divergence measures, are all special cases of it. Finally, we also discuss the information generating function for residual lifetime variables. © 2020 Elsevier B.V. All rights reserved.
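The information generating function referred to in the abstract is, in the standard formulation due to Golomb, $G_X(\alpha) = \int f(x)^\alpha\,dx$ for a density $f$, and its derivative at $\alpha = 1$ recovers the negative Shannon entropy, $G_X'(1) = -H(X)$. The sketch below (not taken from the paper; the exponential-distribution choice and numerical scheme are illustrative assumptions) checks this identity numerically:

```python
import math
from scipy.integrate import quad

lam = 2.0  # rate of an Exp(lam) distribution (illustrative choice)

def igf(alpha):
    # Information generating function G(alpha) = ∫ f(x)^alpha dx
    # for the exponential density f(x) = lam * exp(-lam * x), x >= 0
    return quad(lambda x: (lam * math.exp(-lam * x)) ** alpha, 0, math.inf)[0]

# Central-difference approximation of G'(1)
h = 1e-4
dG = (igf(1 + h) - igf(1 - h)) / (2 * h)

# Closed-form Shannon entropy of Exp(lam): H = 1 - ln(lam)
H = 1 - math.log(lam)

print(dG, -H)  # G'(1) should agree with -H
```

For Exp(λ) the generating function has the closed form $G(\alpha) = \lambda^{\alpha-1}/\alpha$, so the check can also be done analytically; the numerical version generalizes to densities without a closed-form integral.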

