Article

Sparse and silent coding in neural circuits

Journal

NEUROCOMPUTING
Volume 79, Issue -, Pages 115-124

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2011.10.017

Keywords

l(1)-Norm; Cross-entropy method; Sparse coding

Funding

  1. EC NEST [043261]
  2. European Union
  3. European Social Fund
  4. European Regional Development Fund [TAMOP 4.2.1./B-09/KMR-2010-0003, KMOP-1.1.2-08/1-2008-0002]

Abstract

Sparse coding algorithms find a linear basis in which signals can be represented by a small number of non-zero coefficients. Such coding may play an important role in neural information processing, and metabolically efficient natural solutions serve as an inspiration for algorithms employed in various areas of computer science. In particular, finding the non-zero coefficients in overcomplete sparse coding is a computationally hard problem, for which different approximate solutions have been proposed. Methods that minimize the magnitude of the coefficients ('l(1)-norm') instead of minimizing the size of the active subset of features ('l(0)-norm') may find the optimal solutions, but they do not scale well with the problem size and use centralized algorithms. Iterative, greedy methods, on the other hand, are fast, but they require a priori knowledge of the number of non-zero features, often find suboptimal solutions, and converge to the final sparse form through a series of non-sparse representations. In this article we propose a neurally plausible algorithm that efficiently integrates an l(0)-norm based probabilistic sparse coding model with ideas inspired by novel iterative solutions. Furthermore, the resulting algorithm does not require an exactly defined sparseness level, which makes it suitable for representing natural stimuli with a varying number of features. We demonstrate that our combined method can find optimal solutions in cases where other, l(1)-norm based algorithms already fail. (C) 2011 Elsevier B.V. All rights reserved.
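The abstract contrasts l(1)-norm solvers with greedy l(0)-style methods. The sketch below is not the article's algorithm (which combines an l(0)-norm probabilistic model with cross-entropy-inspired iteration); it is only a generic toy comparison of the two solver families on an overcomplete dictionary, and every name and parameter in it (ista, omp, lam, k_true, and so on) is an illustrative assumption rather than something taken from the paper.

```python
# Toy comparison: l1-based sparse coding (ISTA) vs. a greedy l0-style method (OMP).
# Illustrative sketch only; not the algorithm proposed in the article.
import numpy as np

rng = np.random.default_rng(0)

# Overcomplete dictionary: 32-dimensional signals, 64 unit-norm basis vectors (atoms).
n_dim, n_atoms, k_true = 32, 64, 4
D = rng.standard_normal((n_dim, n_atoms))
D /= np.linalg.norm(D, axis=0)

# Generate a signal from k_true randomly chosen atoms.
true_support = rng.choice(n_atoms, size=k_true, replace=False)
a_true = np.zeros(n_atoms)
a_true[true_support] = rng.standard_normal(k_true)
x = D @ a_true

def ista(D, x, lam=0.05, n_iter=500):
    """l1-norm solver: iterative soft-thresholding (proximal gradient descent)."""
    a = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # inverse Lipschitz constant of the gradient
    for _ in range(n_iter):
        a = a + step * D.T @ (x - D @ a)        # gradient step on the reconstruction error
        a = np.sign(a) * np.maximum(np.abs(a) - step * lam, 0.0)  # soft-threshold (l1 proximal step)
    return a

def omp(D, x, k):
    """Greedy l0-style solver (orthogonal matching pursuit): needs the sparseness level k in advance."""
    support, residual = [], x.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))   # pick the atom most correlated with the residual
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    a = np.zeros(D.shape[1])
    a[support] = coeffs
    return a

a_l1 = ista(D, x)
a_l0 = omp(D, x, k_true)
print("true support:", sorted(true_support))
print("ISTA support:", sorted(np.flatnonzero(np.abs(a_l1) > 1e-3)))
print("OMP  support:", sorted(np.flatnonzero(np.abs(a_l0) > 1e-3)))
```

On this easy noiseless example both solvers typically recover the true support; the trade-offs named in the abstract show up at scale: ISTA needs no sparseness level but relies on global (centralized) operations and careful tuning of lam, while OMP is fast but must be told k and passes through non-sparse intermediate fits.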
