Journal
METRON-INTERNATIONAL JOURNAL OF STATISTICS
Volume 76, Issue 1, Pages 115-131
Publisher
SPRINGER-VERLAG ITALIA SRL
DOI: 10.1007/s40300-017-0119-x
Keywords
Estimation; Generalized Estimating Equations; Gini mean difference; Generalized Pareto distribution; Measures of divergence
Abstract
In the present paper, we define a new measure of divergence between two probability distribution functions F_1 and F_2 based on the Jensen inequality and the Gini mean difference. The proposed measure, which we call the Jensen-Gini (JG) measure of divergence, is symmetric, and its square root is a metric. We show that the JG can be represented as a mixture of Cramér's distances (CD) between the two distributions F_1 and F_2. A generalization of JG for measuring the overall difference between several probability distributions is also proposed. The proposed JG measure of divergence is applied to estimate the unknown parameters of a probability distribution. We consider a statistical model F(x; theta), where the parameter theta ∈ Theta is assumed to be unknown. Based on a random sample drawn from the distribution, we consider the JG between the distribution F(x; theta) and the empirical estimator of the distribution. Then, we estimate the parameter theta as a value in the parameter space Theta which minimizes the JG between the distribution F(x; theta) and its empirical estimator. We call this estimator the minimum Jensen-Gini estimator (MJGE) of the parameter. Several properties of the MJGE are investigated. It is shown that the MJGE belongs to the class of generalized estimating equations. Asymptotic properties of the MJGE, such as consistency and asymptotic normality, are explored. Some simulation studies are performed to evaluate the performance of the MJGE.
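The abstract does not give the JG formula explicitly, but since JG is represented as a mixture of Cramér's distances, the minimum-distance estimation idea can be sketched with Cramér's distance as a stand-in: choose the theta whose model CDF is closest to the empirical CDF. The exponential model, the grid search, and all function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ecdf(sample, x):
    # Empirical CDF of the sample evaluated at the points x.
    return np.searchsorted(np.sort(sample), x, side="right") / sample.size

def cramer_distance(theta, sample, grid):
    # Squared Cramer distance between the Exponential(theta) CDF and the
    # empirical CDF, approximated by a Riemann sum on a uniform grid.
    # (Stand-in for the JG divergence, whose exact form is not given here.)
    model_cdf = 1.0 - np.exp(-theta * grid)
    diff = model_cdf - ecdf(sample, grid)
    return np.sum(diff ** 2) * (grid[1] - grid[0])

def minimum_distance_estimate(sample, thetas, grid):
    # Minimize the distance over a candidate grid of theta values
    # (a simple stand-in for solving the estimating equation).
    values = [cramer_distance(t, sample, grid) for t in thetas]
    return thetas[int(np.argmin(values))]

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0 / 2.0, size=2000)  # true theta = 2
thetas = np.linspace(0.5, 4.0, 351)
grid = np.linspace(0.0, 5.0, 1001)
theta_hat = minimum_distance_estimate(sample, thetas, grid)
print(theta_hat)
```

With a moderately large sample the estimate lands near the true rate theta = 2, illustrating the consistency property claimed for minimum-divergence estimators of this type.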