Article

Empirical Estimation of Information Measures: A Literature Guide

Journal

ENTROPY
Volume 21, Issue 8, Pages -

Publisher

MDPI
DOI: 10.3390/e21080720

Keywords

information measures; empirical estimators; entropy; relative entropy; mutual information; universal estimation

Funding

  1. US National Science Foundation [CCF-1016625]
  2. Center for Science of Information, an NSF Science and Technology Center [CCF-0939370]

Abstract

We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
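As a minimal illustration of the kind of estimator the surveyed literature studies, the sketch below computes the plug-in (maximum-likelihood) estimates of discrete entropy and mutual information from i.i.d. samples. It is a generic textbook construction, not a method taken from the paper; the function names and the use of NumPy are illustrative choices.

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of H(X) in bits from i.i.d. samples."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()                  # empirical distribution
    return float(-np.sum(p * np.log2(p)))      # H_hat = -sum_x p(x) log2 p(x)

def plugin_mutual_information(x, y):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y) in bits."""
    xy = np.column_stack((x, y))               # joint samples as rows
    _, joint_counts = np.unique(xy, axis=0, return_counts=True)
    p_xy = joint_counts / joint_counts.sum()
    h_xy = float(-np.sum(p_xy * np.log2(p_xy)))
    return plugin_entropy(x) + plugin_entropy(y) - h_xy

# Example: X uniform on {0,1}, Y = X flipped with probability 0.1
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=10_000)
y = np.where(rng.random(10_000) < 0.9, x, 1 - x)
print(plugin_entropy(x))                # close to 1 bit
print(plugin_mutual_information(x, y))  # close to 1 - h(0.1) ≈ 0.531 bits
```

The plug-in estimate is biased (downward for entropy) when the alphabet is large relative to the sample size, which is one motivation for the bias-corrected and minimax-optimal estimators that surveys of this kind catalog.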
