Journal
STATISTICS & PROBABILITY LETTERS
Volume 170, Issue -, Pages -
Publisher
ELSEVIER
DOI: 10.1016/j.spl.2020.108995
Keywords
Information generating function; Shannon entropy; Jensen-Shannon entropy; Jensen-extropy; Kullback-Leibler divergence
Abstract
In this work, we consider the information generating function measure and develop some new results associated with it. We specifically propose two new divergence measures and show that some of the well-known information divergences such as Jensen-Shannon, Jensen-extropy and Jensen-Taneja divergence measures are all special cases of it. Finally, we also discuss the information generating function for residual lifetime variables. (C) 2020 Elsevier B.V. All rights reserved.
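For background on the measure the abstract refers to: Golomb's information generating function of a discrete distribution p is I(u) = Σᵢ pᵢᵘ, and its derivative at u = 1 equals Σᵢ pᵢ log pᵢ, i.e. the negative Shannon entropy. The sketch below is an illustration of that classical relationship only, not of the paper's new divergence measures; the distribution and step size are arbitrary choices.

```python
import math

def igf(p, u):
    """Golomb's information generating function: I(u) = sum_i p_i**u."""
    return sum(pi ** u for pi in p)

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]          # an arbitrary example distribution
h = 1e-6                       # step for the central-difference derivative

# Numerically differentiate I(u) at u = 1; this should recover -H(p).
deriv = (igf(p, 1 + h) - igf(p, 1 - h)) / (2 * h)
print(deriv)                   # approximately -H(p)
print(-shannon_entropy(p))
```

The agreement of the two printed values illustrates why the generating function is a convenient umbrella: entropy-type quantities fall out of it by differentiation at a single point.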