Article

On the performance of Fisher Information Measure and Shannon entropy estimators

Journal

Physica A: Statistical Mechanics and its Applications

Publisher

ELSEVIER
DOI: 10.1016/j.physa.2017.04.184

Keywords

Fisher Information Measure; Shannon entropy; Estimation

Abstract

The performance of two estimators of the Fisher Information Measure (FIM) and the Shannon entropy (SE) is investigated: one based on the discretization of the FIM and SE formulae (discrete-based approach) and the other based on kernel estimation of the probability density function (pdf) (kernel-based approach). The two approaches are employed to estimate the FIM and SE of Gaussian processes (with different values of the standard deviation sigma and size N), whose theoretical FIM and SE depend on sigma. The FIM (SE) estimated with the discrete-based approach is approximately constant with sigma, but decreases (increases) with the bin number L; in particular, the discrete-based approach furnishes a rather accurate estimate of the FIM (SE) only for L ∝ sigma. Furthermore, for small values of sigma the mean relative error decreases as the size N of the series increases, while for large values of sigma it increases with N. The FIM (SE) estimated with the kernel-based approach is very close to the theoretical value for any sigma, and its mean relative error decreases as the length of the series increases. Comparing the two approaches, the kernel-based estimates of the FIM and SE are much closer to the theoretical values for any sigma and any N, and are therefore to be preferred over the discrete-based estimates. (C) 2017 Elsevier B.V. All rights reserved.
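To make the comparison concrete, the following is a minimal sketch (not the paper's own code) of the two estimation strategies the abstract contrasts, applied to a synthetic Gaussian series. The particular discretization of the FIM, the use of scipy's gaussian_kde with its default bandwidth, the bin count L, and the values of sigma and N are assumptions made here for illustration only; the theoretical references are the standard Gaussian results FIM = 1/sigma^2 and SE = 0.5 ln(2*pi*e*sigma^2).

```python
# Sketch of discrete-based vs. kernel-based estimation of FIM and SE
# for a Gaussian series. Illustrative only; not the authors' implementation.
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gaussian_kde


def discrete_fim_se(x, L=100):
    """Discrete-based estimates: histogram the data into L bins and apply
    the discretized FIM and SE formulae to the bin probabilities."""
    counts, _ = np.histogram(x, bins=L)
    p = counts / counts.sum()
    nz = p > 0
    se = -np.sum(p[nz] * np.log(p[nz]))           # Shannon entropy (nats)
    dp = np.diff(p)
    valid = p[:-1] > 0
    fim = np.sum(dp[valid] ** 2 / p[:-1][valid])  # one common discretization of the FIM
    return fim, se


def kernel_fim_se(x, grid_points=2048):
    """Kernel-based estimates: fit a Gaussian KDE to the data, then evaluate
    FIM = int f'(x)^2 / f(x) dx and SE = -int f(x) ln f(x) dx numerically."""
    kde = gaussian_kde(x)                          # default (Scott) bandwidth
    lo, hi = x.min() - 3 * x.std(), x.max() + 3 * x.std()
    grid = np.linspace(lo, hi, grid_points)
    f = kde(grid)
    df = np.gradient(f, grid)
    nz = f > 1e-12                                 # avoid division by ~0 in the tails
    fim = trapezoid(df[nz] ** 2 / f[nz], grid[nz])
    se = -trapezoid(f[nz] * np.log(f[nz]), grid[nz])
    return fim, se


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sigma, N = 2.0, 10_000                         # illustrative values of sigma and N
    x = rng.normal(0.0, sigma, N)
    fim_theory = 1.0 / sigma ** 2                  # theoretical FIM of a Gaussian
    se_theory = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)  # theoretical SE (nats)
    print("discrete (L=100):", discrete_fim_se(x))
    print("kernel-based:    ", kernel_fim_se(x))
    print("theory:          ", (fim_theory, se_theory))
```

Running such a sketch reproduces the qualitative behaviour described above: the discrete-based SE and FIM shift systematically with the bin number L, while the kernel-based values track the theoretical 1/sigma^2 and 0.5 ln(2*pi*e*sigma^2) closely for any sigma.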
