Article

Layer-Skipping Connections Improve the Effectiveness of Equilibrium Propagation on Layered Networks

Journal

Frontiers in Computational Neuroscience
Volume 15

Publisher

Frontiers Media SA
DOI: 10.3389/fncom.2021.627357

Keywords

equilibrium propagation; deep learning; small-world; layer-skipping connections; neuromorphic computing; biologically-motivated

Abstract

Equilibrium propagation is a learning framework that marks a step forward in the search for a biologically plausible implementation of deep learning and could be implemented efficiently in neuromorphic hardware. Previous applications of this framework to layered networks encountered a vanishing gradient problem that has not yet been solved in a simple, biologically plausible way. In this paper, we demonstrate that the vanishing gradient problem can be mitigated by replacing some of a layered network's connections with random layer-skipping connections in a manner inspired by small-world networks. This approach would be convenient to implement in neuromorphic hardware and is biologically plausible.
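
This listing does not include the paper's actual rewiring procedure, so the Python sketch below only illustrates the general idea under stated assumptions: starting from a strictly layered feedforward graph, each adjacent-layer connection is replaced, with some probability, by a random connection that skips at least one layer, in the spirit of Watts-Strogatz small-world rewiring. The function name rewire_layered_network and the parameter p_rewire are hypothetical, not taken from the paper.

```python
import random

def rewire_layered_network(layer_sizes, p_rewire=0.1, seed=0):
    """Replace a fraction of adjacent-layer connections with random
    layer-skipping connections (Watts-Strogatz-style rewiring).
    Hypothetical sketch; not the paper's exact procedure."""
    rng = random.Random(seed)

    # Give every unit a global index and record which layer it belongs to.
    layer_of, offsets, idx = {}, [], 0
    for layer, size in enumerate(layer_sizes):
        offsets.append(idx)
        for _ in range(size):
            layer_of[idx] = layer
            idx += 1

    # Start from dense connectivity between adjacent layers only.
    edges = []
    for layer in range(len(layer_sizes) - 1):
        for i in range(layer_sizes[layer]):
            for j in range(layer_sizes[layer + 1]):
                edges.append((offsets[layer] + i, offsets[layer + 1] + j))

    # With probability p_rewire, redirect an edge's target to a random unit
    # in a later, non-adjacent layer, turning it into a skip connection.
    for k, (pre, post) in enumerate(edges):
        if rng.random() < p_rewire:
            candidates = [u for u in layer_of if layer_of[u] > layer_of[pre] + 1]
            if candidates:
                edges[k] = (pre, rng.choice(candidates))

    return edges, layer_of

edges, layer_of = rewire_layered_network([4, 8, 8, 4], p_rewire=0.1)
n_skip = sum(1 for pre, post in edges if layer_of[post] - layer_of[pre] > 1)
print(f"{n_skip} of {len(edges)} connections now skip at least one layer")
```

In an equilibrium-propagation setting these (pre, post) pairs would define the connectivity of the energy-based network; they are kept as plain index pairs here so the sketch stays self-contained.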
