Article

A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience

Journal

FRONTIERS IN NEUROINFORMATICS
Volume 15, Issue -, Pages -

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fninf.2021.596443

Keywords

entropy; mutual information; portable network graphic image; DEFLATE compression; rastergram; lossless (image) compression; place field

Funding

  1. Wellcome Trust Principal Fellowship [212251/Z/18/Z]
  2. European Research Council (ERC) Proof of Principle grant [767372]
  3. LABEX CORTEX of Universite de Lyon [ANR11-LABX-0042]

Abstract

Calculations of the entropy of a signal, or of the mutual information between two variables, are valuable analytical tools in neuroscience. They can be applied to all types of data, capture non-linear interactions, and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called sampling disaster exist, but they require significant expertise as well as considerable time and computational resources. As such, there is a need for a simple, unbiased, and computationally efficient tool for estimating levels of entropy and mutual information. In this article, we propose that entropy-encoding compression algorithms, widely used in text and image compression, fulfill these requirements. By simply saving the signal in PNG image format and measuring the size of the file on the hard drive, we can estimate changes in entropy across different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of the mutual information between a stimulus and the observed responses across different conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, while the method can be used under all kinds of experimental conditions, we provide examples of its application to patch-clamp recordings, detection of place cells, and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad applicability make it a powerful tool for estimating both across experiments.
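The core idea of the abstract can be sketched in a few lines: save a signal as a PNG (which uses DEFLATE, an entropy-encoding compression scheme) and use the resulting file size as a relative proxy for entropy. The snippet below is a minimal illustration of that idea, not the authors' actual pipeline; the function name and image sizes are assumptions for demonstration.

```python
# Sketch of the compression-based entropy estimate: a high-entropy
# (white-noise-like) signal should yield a larger PNG file than a
# low-entropy (constant) signal of the same dimensions.
import io
import numpy as np
from PIL import Image

def png_size_bytes(signal_2d):
    """Save a 2-D array as an 8-bit grayscale PNG in memory; return its size in bytes."""
    buf = io.BytesIO()
    Image.fromarray(signal_2d.astype(np.uint8), mode="L").save(buf, format="PNG")
    return buf.getbuffer().nbytes

rng = np.random.default_rng(0)
noise = rng.integers(0, 256, size=(256, 256))   # white-noise-like signal
flat = np.zeros((256, 256), dtype=np.uint8)     # constant, low-entropy signal

# DEFLATE compresses the constant image far more than the noisy one,
# so file size tracks entropy in a relative (not absolute) sense.
print(png_size_bytes(noise) > png_size_bytes(flat))  # True
```

As the abstract notes, this yields no absolute entropy value, but comparing file sizes of identically formatted images across experimental conditions tracks relative changes in entropy.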

