Article

Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations

Journal

ENTROPY
Volume 19, Issue 8

Publisher

MDPI
DOI: 10.3390/e19080427

Keywords

information theory; minimum entropy; maximum entropy; statistical mechanics; Ising model; pairwise correlations; compressed sensing; neural networks

Funding

  1. Hellman Foundation
  2. McDonnell Foundation
  3. McKnight Foundation
  4. Mary Elizabeth Rennie Endowment for Epilepsy Research
  5. National Science Foundation [IIS-1219199]
  6. NSF All-Institutes Postdoctoral Fellowship
  7. Mathematical Sciences Research Institute through its core grant [DMS-0441170]
  8. U.S. Army Research Laboratory
  9. U.S. Army Research Office [W911NF-13-1-0390]

Abstract

Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. We provide upper and lower bounds on the entropy for the minimum entropy distribution over arbitrarily large collections of binary units with any fixed set of mean values and pairwise correlations. We also construct specific low-entropy distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy scaling logarithmically with system size for any set of first- and second-order statistics consistent with arbitrarily large systems. We further demonstrate that some sets of these low-order statistics can only be realized by small systems. Our results show how only small amounts of randomness are needed to mimic low-order statistical properties of highly entropic distributions, and we discuss some applications for engineered and biological information transmission systems.
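The abstract's central observation — that very different amounts of randomness can realize the same means and pairwise correlations — can be illustrated with a standard toy construction (pairwise-independent bits built from XOR). This sketch is not taken from the paper itself; it only demonstrates the kind of entropy gap the authors analyze, on a three-unit system small enough to enumerate exactly:

```python
# Two distributions over {0,1}^3 with identical means (1/2 each) and
# identical pairwise second moments (E[x_i x_j] = 1/4), but entropies
# of 3 bits vs. 2 bits. Illustrative sketch, not the paper's construction.
from itertools import product
from math import log2

def stats_and_entropy(p):
    """Return means, pairwise second moments, and entropy (bits) of a
    distribution p mapping states in {0,1}^3 to probabilities."""
    means = [sum(prob * s[i] for s, prob in p.items()) for i in range(3)]
    pair = {(i, j): sum(prob * s[i] * s[j] for s, prob in p.items())
            for i in range(3) for j in range(i + 1, 3)}
    H = -sum(prob * log2(prob) for prob in p.values() if prob > 0)
    return means, pair, H

# Maximum entropy given these statistics: three independent fair bits.
uniform = {s: 1 / 8 for s in product((0, 1), repeat=3)}

# Lower entropy, same first- and second-order statistics:
# x3 = x1 XOR x2 with x1, x2 fair coins (support of only 4 states).
xor = {(a, b, a ^ b): 1 / 4 for a in (0, 1) for b in (0, 1)}

for name, p in [("independent", uniform), ("xor", xor)]:
    means, pair, H = stats_and_entropy(p)
    print(name, means, pair, round(H, 3))
```

Both distributions report means of 0.5 and pairwise moments of 0.25, yet the XOR mixture achieves them with one bit less entropy; the paper's results show how far this gap can be pushed, with minimum-entropy solutions scaling only logarithmically in system size.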

