Article

Divergence Measure of Belief Function and Its Application in Data Fusion

Journal

IEEE ACCESS
Volume 7, Pages 107465-107472

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2019.2932390

Keywords

Kullback-Leibler divergence; Dempster-Shafer evidence theory; basic probability assignment; target recognition

Funding

  1. National Natural Science Foundation of China [61573290, 61503237]

Abstract

Divergence measures are widely used in many applications. To handle uncertainty efficiently in real applications, the basic probability assignment (BPA) of Dempster-Shafer evidence theory is adopted in place of a probability distribution. An open issue, therefore, is how to measure the divergence between BPAs. In this paper, a new divergence measure between two BPAs is proposed. The proposed measure is a generalization of the Kullback-Leibler divergence: when a BPA degenerates to a probability distribution, the proposed belief divergence equals the Kullback-Leibler divergence. Furthermore, compared with existing belief divergence measures, the new method performs better in situations with a high degree of uncertainty and ambiguity. Numerical examples illustrate the efficiency of the proposed divergence measure. In addition, based on the proposed belief divergence measure, a combination model is proposed for data fusion. Finally, an example in target recognition illustrates the advantage of the new belief divergence in handling not only extreme uncertainty but also highly conflicting data.
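The paper's new divergence formula is not reproduced in this abstract, but the two standard building blocks it refers to — the Kullback-Leibler divergence that the proposed measure generalizes, and Dempster's rule of combination used in evidence-based data fusion — can be sketched as follows. The dict-based BPA representation and the function names here are illustrative choices, not taken from the paper:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions,
    each given as a dict mapping outcomes to probabilities."""
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two BPAs, each a dict mapping
    frozenset focal elements to mass values."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                # mass assigned to disjoint focal elements is conflict
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    # normalize by the non-conflicting mass
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Example: two Bayesian BPAs (singleton focal elements) on the frame {A, B};
# such BPAs reduce to probability distributions, so KL divergence applies.
m1 = {frozenset("A"): 0.6, frozenset("B"): 0.4}
m2 = {frozenset("A"): 0.7, frozenset("B"): 0.3}
fused = dempster_combine(m1, m2)
d = kl_divergence({"A": 0.6, "B": 0.4}, {"A": 0.7, "B": 0.3})
```

In the fully Bayesian case above, a belief divergence that generalizes KL divergence should coincide with `kl_divergence`; the paper's contribution lies in extending the measure to BPAs with non-singleton focal elements, where the abstract reports better behavior under high uncertainty and conflict.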
