Article

Optimal guessing under nonextensive framework and associated moment bounds

Journal

STATISTICS & PROBABILITY LETTERS
Volume 197

Publisher

ELSEVIER
DOI: 10.1016/j.spl.2023.109812

Keywords

Guessing strategy; Uncertain source; q-Normalized expectation; Logarithmic norm entropy; Relative (α, β)-entropy; Logarithmic super divergence


This paper discusses the problem of guessing a random variable under Tsallis' non-extensive entropic framework. Both conditional and unconditional guessing problems are considered, and non-extensive moment bounds on the required number of guesses are derived in terms of the logarithmic norm entropy measure. The relationship between these moment bounds and the relative (α, β)-entropies is also explored.
We consider the problem of guessing the realization of a random variable, but under the more general Tsallis non-extensive entropic framework rather than the classical Maxwell-Boltzmann-Gibbs-Shannon framework. We consider both the conditional guessing problem, in the presence of some related side information, and the unconditional one, where no such side information is available. For both types of problem, non-extensive moment bounds on the required number of guesses are derived; here we use the q-normalized expectation in place of the usual (linear) expectation to define the non-extensive moments. These moment bounds are seen to be a function of the logarithmic norm entropy measure, a recently developed two-parameter generalization of the Rényi entropy, and hence admit an information-theoretic interpretation. We have also considered the case of an uncertain source distribution and derived the non-extensive moment bounds for the corresponding mismatched guessing function. These mismatched bounds are, interestingly, linked with an important robust statistical divergence family known as the relative (α, β)-entropies; a similar link is discussed between the optimal mismatched guessing and the extremes of these relative entropy measures.

© 2023 Elsevier B.V. All rights reserved.
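As background for the abstract above: in the classical setting, the optimal guessing strategy queries outcomes in decreasing order of probability, and the abstract's q-normalized expectation is the standard normalized (escort) expectation from Tsallis statistics. A minimal Python sketch under those standard definitions (the function names are illustrative, not taken from the paper):

```python
import numpy as np

def optimal_guessing_function(p):
    """Guessing function G: rank each outcome by decreasing probability,
    so G(x) = position of x in that ordering (1-indexed). Guessing in
    this order minimizes the moments of the number of guesses."""
    p = np.asarray(p, dtype=float)
    order = np.argsort(-p)                    # indices, most probable first
    G = np.empty(len(p), dtype=int)
    G[order] = np.arange(1, len(p) + 1)       # rank of each outcome
    return G

def q_normalized_expectation(values, p, q):
    """Normalized q-expectation <X>_q = sum_i p_i^q x_i / sum_i p_i^q,
    i.e. the mean of X under the escort distribution of order q."""
    p = np.asarray(p, dtype=float)
    w = p ** q
    return float(np.dot(w, np.asarray(values, dtype=float)) / w.sum())

# Example: a three-outcome source.
p = [0.5, 0.3, 0.2]
G = optimal_guessing_function(p)              # ranks: [1, 2, 3]
m1 = q_normalized_expectation(G, p, q=1.0)    # ordinary expected #guesses
m2 = q_normalized_expectation(G, p, q=2.0)    # non-extensive first moment
```

With q = 1 the escort weights reduce to p itself, recovering the ordinary expected number of guesses; for q ≠ 1 the moment reweights outcomes through p_i^q / Σ_j p_j^q, which is what the paper's non-extensive moment bounds control.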

