Article

Measuring entropy in continuous and digitally filtered neural signals

Journal

JOURNAL OF NEUROSCIENCE METHODS
Volume 196, Issue 1, Pages 81-87

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.jneumeth.2011.01.002

Keywords

Information; Mechanoreceptor; Entropy; Compression; Sorting; Algorithm

Funding

  1. Canadian Institutes of Health Research
  2. Nova Scotia Health Research Foundation

Abstract

Neurons receive, process and transmit information using two distinct signaling formats: analog signals, such as graded changes in membrane potential, and binary digital action potentials. Quantitative estimates of information in neural signals have been based either on information capacity, which measures the theoretical maximum information flow through a communication channel, or on entropy, the amount of information that is required to describe or reproduce a signal. Measurement of entropy is straightforward for digital signals, including action potentials, but is more difficult for analog signals. This problem compromises attempts to estimate information in many neural signals, particularly when there is conversion between the two signal formats. We extended an established method for action potential entropy estimation to provide entropy estimation of analog signals. Our approach is based on context-independent data compression of analog signals, which we call analog compression. Although compression of analog signals is computationally intensive, we describe an algorithm that provides practical, efficient and reliable entropy estimation via analog compression. Implementation of the algorithm is demonstrated at two stages of sensory processing by a mechanoreceptor. © 2011 Elsevier B.V. All rights reserved.
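The general idea behind compression-based entropy estimation can be illustrated with a minimal sketch: quantize an analog signal into discrete symbols, compress the symbol stream losslessly, and take the compressed size as an upper bound on bits per sample. This is a generic illustration using `zlib`, not the paper's analog-compression algorithm; the function name and quantization scheme are assumptions for demonstration only.

```python
import zlib
import numpy as np

def compression_entropy_bits_per_sample(signal, n_levels=64):
    """Rough entropy estimate (bits/sample) of an analog signal:
    quantize the amplitude range into discrete symbols, then use the
    lossless-compressed size as an upper bound on the entropy.
    Hypothetical illustration, not the authors' published method."""
    x = np.asarray(signal, dtype=float)
    lo, hi = x.min(), x.max()
    # Map the amplitude range onto n_levels discrete symbols
    q = ((x - lo) / (hi - lo + 1e-12) * n_levels).astype(np.uint8)
    q = np.clip(q, 0, n_levels - 1)
    # Compressed size in bits, divided by the number of samples
    compressed = zlib.compress(q.tobytes(), 9)
    return 8.0 * len(compressed) / len(q)

rng = np.random.default_rng(0)
noise = rng.normal(size=10_000)                       # unpredictable: high entropy
tone = np.sin(np.linspace(0.0, 20 * np.pi, 10_000))   # predictable: low entropy
# A predictable tone should need far fewer bits per sample than noise
```

Because a general-purpose compressor exploits statistical redundancy, a predictable signal (the sine tone) yields a much smaller bits-per-sample estimate than an unpredictable one (the Gaussian noise), which is the qualitative behaviour an entropy measure should show.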
