Article

Efficiency of Local Learning Rules in Threshold-Linear Associative Networks

Journal

Physical Review Letters
Volume 126, Issue 1, Article 018301

Publisher

American Physical Society
DOI: 10.1103/PhysRevLett.126.018301

Funding

  1. EU Marie Skłodowska-Curie Training Network [765549 M-Gate]
  2. Human Frontier Science Program [RGP0057/2016]
  3. Research Council of Norway (Centre for Neural Computation) [223262]
  4. Kavli Foundation
  5. Research Council of Norway (NORBRAIN1) [197467]

Abstract

We derive the Gardner storage capacity for associative networks of threshold-linear units and show that, with Hebbian learning, they can operate closer to the Gardner bound than binary networks, and even surpass it. This is largely achieved through a sparsification of the retrieved patterns, which we analyze for theoretical and empirical distributions of activity. Since reaching the optimal capacity via nonlocal learning rules such as backpropagation requires slow and neurally implausible training procedures, our results indicate that one-shot self-organized Hebbian learning can be just as efficient.
