Article

Unsupervised Learning to Overcome Catastrophic Forgetting in Neural Networks

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JXCDC.2019.2911135

Keywords

Catastrophic forgetting; continual learning; convolutional neural network (CNN); neuromorphic engineering; phase-change memory (PCM); spike-timing-dependent plasticity (STDP); supervised learning; unsupervised learning

Funding

  1. European Research Council (ERC) through the European Union's Horizon 2020 Research and Innovation Programme [648635]
  2. [R164TYLBZP]


Continual learning is the ability to acquire a new task or new knowledge without losing any previously collected information. Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, where training on a new task erases all previously learned tasks. Here, we present a new concept of a neural network capable of combining supervised convolutional learning with bio-inspired unsupervised learning. Brain-inspired concepts such as spike-timing-dependent plasticity (STDP) and neural redundancy are shown to enable continual learning and prevent catastrophic forgetting without compromising the standard accuracy achievable with state-of-the-art neural networks. Unsupervised learning by STDP is demonstrated by hardware experiments with a one-layer perceptron adopting phase-change memory (PCM) synapses. Finally, we demonstrate full classification testing on the Modified National Institute of Standards and Technology (MNIST) database with an accuracy of 98% and continual learning of up to 30% non-trained classes with 83% average accuracy.
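To make the STDP mechanism mentioned in the abstract concrete, the following is a minimal sketch of a standard pair-based STDP update rule. It is illustrative only: the amplitudes, time constant, and the [0, 1] weight clipping (standing in for a bounded PCM-like conductance) are assumptions, not parameters taken from the paper.

```python
import numpy as np

# Illustrative STDP parameters (assumed values, not from the paper)
A_PLUS = 0.05    # potentiation amplitude
A_MINUS = 0.055  # depression amplitude
TAU = 20.0       # plasticity time constant (ms)

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair.

    Pre-before-post (dt > 0) potentiates the synapse;
    post-before-pre (dt < 0) depresses it, with exponential
    decay of the effect as the spike interval grows.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU)
    return -A_MINUS * np.exp(dt / TAU)

def update_weight(w, t_pre, t_post):
    """Apply the STDP rule, clipping the weight to [0, 1]
    as a bounded PCM-like conductance would require."""
    return float(np.clip(w + stdp_dw(t_pre, t_post), 0.0, 1.0))
```

In this rule, a presynaptic spike shortly before a postsynaptic spike strengthens the synapse, and the reverse ordering weakens it, which lets synapses specialize on frequently co-occurring input patterns without any labels.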


