Proceedings Paper

Biologically-inspired training of spiking recurrent neural networks with neuromorphic hardware

Publisher

IEEE
DOI: 10.1109/AICAS54282.2022.9869963

Keywords

online training; spiking neural networks; neuromorphic hardware; in-memory computing; phase-change memory

Funding

  1. IBM Research AI Hardware Center
  2. ERA-NET [CHIST-ERA-18-ACAI-004]
  3. Swiss National Science Foundation (SNSF) [20CH21_186999]

Abstract

Recurrent spiking neural networks (SNNs) are inspired by the working principles of biological nervous systems and offer unique temporal dynamics and event-based processing. Recently, the error backpropagation through time (BPTT) algorithm has been successfully employed to train SNNs offline, with performance comparable to artificial neural networks (ANNs) on complex tasks. However, BPTT has severe limitations in online learning scenarios, where the network must simultaneously process and learn from incoming data. Specifically, because BPTT separates the inference and update phases, it would require storing all neuronal states in order to calculate the weight updates backwards in time. To address these fundamental issues, alternative credit assignment schemes are required. Within this context, neuromorphic hardware (NMHW) implementations of SNNs can greatly benefit from in-memory computing (IMC) concepts that follow the brain-inspired collocation of memory and processing, further enhancing their energy efficiency. In this work, we utilize e-prop, a biologically-inspired local and online training algorithm that approximates BPTT and is compatible with IMC, and present an approach to support both inference and training of a recurrent SNN using NMHW. To do so, we embed the SNN weights on an in-memory computing NMHW with phase-change memory (PCM) devices and integrate it into a hardware-in-the-loop training setup. We develop our approach with respect to the limited precision and imperfections of the analog devices, using a PCM-based simulation framework and a NMHW consisting of in-memory computing cores fabricated in 14 nm CMOS technology with 256x256 PCM crossbar arrays. We demonstrate that our approach is robust even to 4-bit precision and achieves performance competitive with a 32-bit floating-point realization, while simultaneously equipping the SNN with online training capabilities and exploiting the acceleration benefits of NMHW.
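
As an illustration of the kind of local, online update rule the abstract refers to, below is a minimal NumPy sketch of an e-prop-style learning step for a recurrent network of leaky integrate-and-fire neurons, with a crude uniform 4-bit quantization of the weights standing in for the limited precision of PCM devices. The network sizes, time constants, surrogate derivative, and quantization scheme are illustrative assumptions and do not reproduce the authors' exact algorithm or their hardware-in-the-loop pipeline.

# Minimal sketch of e-prop-style local online learning for a recurrent LIF SNN.
# Weights used in the forward pass are quantized to 4 bits as a crude software
# stand-in for limited-precision analog (PCM) devices.  All constants and sizes
# below are illustrative assumptions, not the paper's configuration.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_rec, n_out = 20, 50, 2   # illustrative layer sizes
T = 100                          # time steps per sequence
alpha = 0.9                      # membrane leak factor
kappa = 0.8                      # readout leak factor
v_th = 0.5                       # spiking threshold
lr = 1e-3                        # learning rate

w_in  = rng.normal(0, 0.1, (n_rec, n_in))
w_rec = rng.normal(0, 0.1, (n_rec, n_rec))
w_out = rng.normal(0, 0.1, (n_out, n_rec))
b_fb  = rng.normal(0, 0.1, (n_rec, n_out))   # fixed random feedback (broadcast) matrix

def quantize(w, bits=4):
    # Uniform quantization to the given bit width (stand-in for device precision).
    scale = np.max(np.abs(w)) + 1e-12
    levels = 2 ** (bits - 1) - 1
    return np.round(w / scale * levels) / levels * scale

def pseudo_derivative(v):
    # Triangular surrogate derivative of the spike function around threshold.
    return 0.3 * np.maximum(0.0, 1.0 - np.abs((v - v_th) / v_th))

def run_and_learn(x, y_target):
    # One sequence: forward simulation with online accumulation of local
    # e-prop-style updates; the updates are applied at the end of the sequence.
    global w_in, w_rec, w_out
    v = np.zeros(n_rec); z = np.zeros(n_rec); y = np.zeros(n_out)
    trace_in  = np.zeros((n_rec, n_in))    # filtered presynaptic traces
    trace_rec = np.zeros((n_rec, n_rec))
    dw_in  = np.zeros_like(w_in)
    dw_rec = np.zeros_like(w_rec)
    dw_out = np.zeros_like(w_out)

    # Forward pass runs on quantized copies, mimicking device-resident weights.
    w_in_q, w_rec_q, w_out_q = quantize(w_in), quantize(w_rec), quantize(w_out)

    for t in range(T):
        z_prev = z
        # Leaky integrate-and-fire dynamics with soft reset.
        v = alpha * v + w_in_q @ x[t] + w_rec_q @ z_prev - z_prev * v_th
        z = (v > v_th).astype(float)
        y = kappa * y + w_out_q @ z
        err = y - y_target[t]

        psi = pseudo_derivative(v)                       # surrogate spike derivative
        trace_in  = alpha * trace_in  + np.outer(np.ones(n_rec), x[t])
        trace_rec = alpha * trace_rec + np.outer(np.ones(n_rec), z_prev)
        learn_sig = b_fb @ err                           # broadcast learning signal

        # Local update accumulation: learning signal times eligibility trace.
        dw_in  += (learn_sig * psi)[:, None] * trace_in
        dw_rec += (learn_sig * psi)[:, None] * trace_rec
        dw_out += np.outer(err, z)

    # Updates are applied to the high-precision (host-side) weight copies.
    w_in  -= lr * dw_in
    w_rec -= lr * dw_rec
    np.fill_diagonal(w_rec, 0.0)                         # no self-connections
    w_out -= lr * dw_out
    return y

# Toy usage: random input spike trains and a constant two-class target.
x = (rng.random((T, n_in)) < 0.1).astype(float)
y_target = np.tile([1.0, 0.0], (T, 1))
for epoch in range(5):
    y_last = run_and_learn(x, y_target)
    print(epoch, np.abs(y_last - y_target[-1]).mean())

In an actual hardware-in-the-loop setting, the forward pass would be executed on the PCM crossbar arrays and only the accumulated updates would be transferred back to the devices; here the quantized copies merely emulate that effect in software.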

