Article

Information Measure in Terms of the Hazard Function and Its Estimate

Journal

ENTROPY
Volume 23, Issue 3, Article 298

Publisher

MDPI
DOI: 10.3390/e23030298

Keywords

censoring variable; entropy; Fisher information; Kullback-Leibler information; order statistics

Funding

  1. Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education [2018R1D1A1B07042581]
  2. National Research Foundation of Korea [2018R1D1A1B07042581]; funding source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)


Abstract

It is well known that several information measures, including Fisher information and entropy, can be represented in terms of the hazard function. In this paper, we provide representations of further information measures, including quantal Fisher information and quantal Kullback-Leibler (KL) information, in terms of the hazard function and the reverse hazard function. We also provide several estimators of the quantal KL information, among them the Anderson-Darling test statistic, and compare their performances.
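
As a brief illustration of the hazard-function viewpoint mentioned in the abstract (this identity is standard and not taken from the paper itself): for a continuous random variable X with density f, distribution function F, and hazard function \lambda(x) = f(x)/(1 - F(x)), the differential entropy can be written as

    H(X) = -\int f(x)\,\log f(x)\,dx = 1 - \mathbb{E}\!\left[\log \lambda(X)\right],

which follows from writing f(x) = \lambda(x)\,(1 - F(x)) and noting that 1 - F(X) is uniform on (0, 1), so \mathbb{E}[\log(1 - F(X))] = -1. The paper develops representations of this type for further measures, such as quantal Fisher information and quantal KL information.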
