Article

Bidirectional Associative Memories: Unsupervised Hebbian Learning to Bidirectional Backpropagation

Journal

IEEE Transactions on Systems, Man, and Cybernetics: Systems

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TSMC.2020.3043249

Keywords

Bidirectional associative memory (BAM); bidirectional backpropagation; global stability; Hebbian learning

Abstract

Bidirectional associative memories (BAMs) pass neural signals forward and backward through the same web of synapses. Earlier BAMs had no hidden neurons and did not use supervised learning. They tuned their synaptic weights with unsupervised Hebbian or competitive learning. Two-layer feedback BAMs always converge to fixed-point equilibria for threshold or threshold-like neurons. Every rectangular connection matrix is bidirectionally stable. These simpler BAMs extend to arbitrary hidden layers with supervised learning if the resulting bidirectional backpropagation algorithm uses the proper layer likelihood in the forward and backward directions. Bidirectional backpropagation lets users run deep classifiers and regressors in reverse as well as forward. Bidirectional training exploits pattern and synaptic information that forward-only running ignores.
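The fixed-point recall that the abstract describes is easy to reproduce in a small simulation. Below is a minimal sketch, assuming bipolar (+1/-1) patterns, Hebbian outer-product encoding, and synchronous threshold updates; the pattern pairs, dimensions, and function names are illustrative choices, not taken from the paper.

```python
import numpy as np

def hebbian_bam_matrix(pairs):
    """Sum of outer products a_k b_k^T over the stored bipolar pattern pairs."""
    n, p = pairs[0][0].size, pairs[0][1].size
    M = np.zeros((n, p))
    for a, b in pairs:
        M += np.outer(a, b)
    return M

def sign_threshold(x, prev):
    """Bipolar threshold; a zero net input keeps the neuron's previous state."""
    out = np.sign(x).astype(int)
    out[out == 0] = prev[out == 0]
    return out

def bam_recall(M, a, max_sweeps=50):
    """Ping-pong signals a -> b -> a -> ... until a bidirectional fixed point."""
    # Arbitrary all-ones tie-break state for the very first backward pass.
    b = sign_threshold(M.T @ a, np.ones(M.shape[1], dtype=int))
    for _ in range(max_sweeps):
        a_next = sign_threshold(M @ b, a)
        b_next = sign_threshold(M.T @ a_next, b)
        if np.array_equal(a_next, a) and np.array_equal(b_next, b):
            break                      # (a, b) is a fixed-point equilibrium
        a, b = a_next, b_next
    return a, b

# Store two illustrative pattern pairs, then recall from a corrupted cue.
A1 = np.array([ 1, -1,  1, -1,  1, -1]); B1 = np.array([ 1,  1, -1, -1])
A2 = np.array([-1, -1,  1,  1, -1, -1]); B2 = np.array([ 1, -1,  1, -1])
M = hebbian_bam_matrix([(A1, B1), (A2, B2)])

cue = A1.copy(); cue[0] = -cue[0]      # flip one bit of A1
a_rec, b_rec = bam_recall(M, cue)
print(a_rec, b_rec)                    # settles on (A1, B1)
```

Because the two stored pairs here are orthogonal, the ping-pong recall settles on (A1, B1) even from a one-bit-corrupted cue. The stability claim in the abstract is stronger still: any rectangular matrix M, trained or not, reaches some bidirectional fixed point under such threshold updates.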

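The hidden-layer extension can be sketched in the same spirit. The snippet below illustrates the joint-likelihood idea under stated assumptions: a softmax (cross-entropy) likelihood in the forward direction, a Gaussian (squared-error) likelihood in the backward direction, and both directions sharing the same weight matrices through transposition. It uses finite-difference gradients for clarity rather than speed, and the shapes, names, and hyperparameters are illustrative, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 3

# One weight matrix per layer serves both directions: the forward pass uses
# W1 and W2, the backward pass uses their transposes.
params = {
    "W1": rng.normal(0.0, 0.3, (n_hid, n_in)),
    "W2": rng.normal(0.0, 0.3, (n_out, n_hid)),
}

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def joint_loss(p, x, y):
    """Forward cross-entropy plus backward squared error on shared weights."""
    h_f = np.tanh(p["W1"] @ x)            # forward: x -> hidden -> class probs
    probs = softmax(p["W2"] @ h_f)
    ce = -np.log(probs[np.argmax(y)] + 1e-12)
    h_b = np.tanh(p["W2"].T @ y)          # backward: y -> hidden -> input estimate
    x_hat = p["W1"].T @ h_b
    se = 0.5 * np.sum((x_hat - x) ** 2)
    return ce + se

def grad_step(p, x, y, lr=0.05, eps=1e-5):
    """One finite-difference gradient-descent step on the joint loss."""
    grads = {}
    for name, W in p.items():
        g = np.zeros_like(W)
        for idx in np.ndindex(W.shape):
            old = W[idx]
            W[idx] = old + eps; up = joint_loss(p, x, y)
            W[idx] = old - eps; dn = joint_loss(p, x, y)
            W[idx] = old
            g[idx] = (up - dn) / (2.0 * eps)
        grads[name] = g
    for name in p:
        p[name] -= lr * grads[name]

# One illustrative pattern pair: an input vector x and a one-hot class label y.
x = np.array([1.0, -1.0, 0.5, -0.5])
y = np.array([0.0, 1.0, 0.0])
for _ in range(200):
    grad_step(params, x, y)
print(round(joint_loss(params, x, y), 4))   # joint loss after training
```

Minimizing this joint loss trades off the two directions on the same synapses, which is the sense in which bidirectional training exploits pattern and synaptic information that forward-only training ignores.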