Article

Jensen-information generating function and its connections to some well-known information measures

Journal

STATISTICS & PROBABILITY LETTERS
Volume 170, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.spl.2020.108995

Keywords

Information generating function; Shannon entropy; Jensen-Shannon entropy; Jensen-extropy; Kullback-Leibler divergence


In this work, we consider the information generating function measure and develop some new results associated with it. We specifically propose two new divergence measures and show that some of the well-known information divergences such as Jensen-Shannon, Jensen-extropy and Jensen-Taneja divergence measures are all special cases of it. Finally, we also discuss the information generating function for residual lifetime variables. (C) 2020 Elsevier B.V. All rights reserved.
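As a quick illustrative sketch (not code from the paper), the information generating function of a density f is the standard quantity I(α) = ∫ f(x)^α dx, and a well-known property is that its derivative at α = 1 equals minus the Shannon entropy. The check below verifies both facts numerically for an exponential density; the rate parameter `lam` and the grid are arbitrary choices for the example.

```python
import numpy as np

# Illustrative numerical check (assumptions, not the paper's code):
# the information generating function I(alpha) = integral of f(x)**alpha dx,
# and I'(1) = -H(X), the negative Shannon entropy.
# Verified here for an exponential density with rate lam.

lam = 2.0
x = np.linspace(1e-9, 40.0, 2_000_000)
f = lam * np.exp(-lam * x)

def igf(alpha):
    # Trapezoidal rule for the integral of f(x)**alpha over the support.
    y = f ** alpha
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2)

# Closed form for Exp(lam): I(alpha) = lam**(alpha - 1) / alpha
assert abs(igf(1.5) - lam**0.5 / 1.5) < 1e-4

# Central difference approximates I'(1);
# the Shannon entropy of Exp(lam) is 1 - ln(lam).
h = 1e-5
igf_deriv_at_1 = (igf(1 + h) - igf(1 - h)) / (2 * h)
shannon_entropy = 1.0 - np.log(lam)
assert abs(igf_deriv_at_1 + shannon_entropy) < 1e-3
```

Note that I(1) = 1 for any density, so the generating function encodes information only through its shape around α = 1; this is what makes divergences built from it, such as the Jensen-type measures studied in the paper, natural to define.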

