Article

Layer-Skipping Connections Improve the Effectiveness of Equilibrium Propagation on Layered Networks

Journal

Frontiers in Computational Neuroscience

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fncom.2021.627357

Keywords

equilibrium propagation; deep learning; small-world; layer-skipping connections; neuromorphic computing; biologically-motivated

Abstract

Equilibrium propagation is a learning framework that marks a step forward in the search for a biologically-plausible implementation of deep learning, and could be implemented efficiently in neuromorphic hardware. Previous applications of this framework to layered networks encountered a vanishing gradient problem that has not yet been solved in a simple, biologically-plausible way. In this paper, we demonstrate that the vanishing gradient problem can be mitigated by replacing some of a layered network's connections with random layer-skipping connections in a manner inspired by small-world networks. This approach would be convenient to implement in neuromorphic hardware and is biologically plausible.
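
The abstract describes the rewiring only at a high level. Below is a minimal sketch of how such small-world-inspired, layer-skipping rewiring of a layered network's connectivity could look, loosely following the Watts-Strogatz rewiring scheme. The layer sizes, the rewiring probability p, the forward-only skip rule, and all function names are illustrative assumptions, not the paper's actual procedure.

```python
# Minimal sketch (assumptions, not the authors' implementation): with
# probability p, each adjacent-layer connection is replaced by a connection
# from the same presynaptic neuron to a random neuron in a later layer,
# which may skip one or more layers.
import numpy as np


def layered_connections(layer_sizes):
    """Dense (pre, post) pairs between adjacent layers, using one global neuron index."""
    offsets = np.cumsum([0] + list(layer_sizes))
    conns = [(i, j)
             for l in range(len(layer_sizes) - 1)
             for i in range(offsets[l], offsets[l + 1])        # neurons in layer l
             for j in range(offsets[l + 1], offsets[l + 2])]   # neurons in layer l + 1
    return conns, offsets


def rewire_with_layer_skips(layer_sizes, p, seed=None):
    """Replace each feed-forward connection, with probability p, by a random
    layer-skipping connection from the same presynaptic neuron."""
    rng = np.random.default_rng(seed)
    conns, offsets = layered_connections(layer_sizes)
    # layer index of every neuron, recovered from the cumulative offsets
    layer_of = np.searchsorted(offsets, np.arange(offsets[-1]), side="right") - 1
    rewired = []
    for i, j in conns:
        if rng.random() < p:
            # redirect to a uniformly random neuron in any later layer
            j = int(rng.integers(offsets[layer_of[i] + 1], offsets[-1]))
        rewired.append((i, j))
    return rewired, layer_of


if __name__ == "__main__":
    layer_sizes = [8, 16, 16, 4]   # toy sizes; the paper's networks are larger
    conns, layer_of = rewire_with_layer_skips(layer_sizes, p=0.1, seed=0)
    n_skip = sum(1 for i, j in conns if layer_of[j] - layer_of[i] > 1)
    print(f"{n_skip} of {len(conns)} connections now skip at least one layer")
```

In an equilibrium propagation setting, a connection list like this would simply determine which symmetric weights exist in the energy-based network; the local contrastive learning rule itself would be unchanged by the altered topology.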
