Article

Liquid State Machine on SpiNNaker for Spatio-Temporal Classification Tasks

Journal

FRONTIERS IN NEUROSCIENCE
Volume 16

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fnins.2022.819063

Keywords

Liquid State Machine; N-MNIST; neuromorphic hardware; spiking neural network; SpiNNaker

Funding

  1. EU [871371]
  2. Ministry of Science and Innovation [PID2019-105556GB-C31]
  3. European Regional Development Fund
  4. CONACYT [688116/578600]


Liquid State Machines (LSMs) are computing reservoirs composed of recurrently connected Spiking Neural Networks (SNNs). They have attracted research interest both for their capacity to model biological structures and as promising pattern-recognition tools well suited to implementation in neuromorphic processors, benefiting from the modest computing resources required by their training process. However, optimizing LSMs for complex tasks such as event-based computer vision has proven difficult, and few implementations on large-scale neuromorphic processors have been attempted. In this work, we show that offline-trained LSMs implemented on the SpiNNaker neuromorphic processor can classify visual events, achieving state-of-the-art performance on the event-based N-MNIST dataset. The readout layer is trained using a recent adaptation of back-propagation-through-time (BPTT) for SNNs, while the internal weights of the reservoir are kept static. Results show that mapping our LSM from a Deep Learning framework to SpiNNaker does not affect the performance of the classification task. Additionally, we show that weight quantization, which substantially reduces the memory footprint of the LSM, has only a small impact on its performance.
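The architecture the abstract describes can be sketched in a few lines: a fixed random recurrent reservoir of leaky integrate-and-fire (LIF) neurons converts input spike trains into a high-dimensional "liquid state", and only a linear readout on that state would be trained (the paper trains its readout with a BPTT adaptation for SNNs; the sketch below stops at producing the liquid state and adds an 8-bit weight-quantization step). All sizes, time constants, and connection densities here are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

# Minimal LSM reservoir sketch (illustrative only, not the paper's model).
rng = np.random.default_rng(0)

N_IN, N_RES, T = 20, 100, 50      # inputs, reservoir neurons, time steps (assumed)
TAU, V_TH = 10.0, 1.0             # membrane time constant, spike threshold (assumed)

W_in = rng.normal(0, 0.5, (N_RES, N_IN))    # static input weights
W_rec = rng.normal(0, 0.1, (N_RES, N_RES))  # static recurrent weights
W_rec *= rng.random((N_RES, N_RES)) < 0.1   # sparse recurrent connectivity

def run_reservoir(spikes_in):
    """Simulate the LIF reservoir; return the spike raster of shape (T, N_RES)."""
    v = np.zeros(N_RES)                 # membrane potentials
    s = np.zeros(N_RES)                 # reservoir spikes at previous step
    raster = np.zeros((T, N_RES))
    decay = np.exp(-1.0 / TAU)          # per-step leak factor
    for t in range(T):
        v = decay * v + W_in @ spikes_in[t] + W_rec @ s
        s = (v >= V_TH).astype(float)   # threshold crossing emits a spike
        v[s > 0] = 0.0                  # reset membrane after a spike
        raster[t] = s
    return raster

# Random Poisson-like input spike train standing in for N-MNIST events.
spikes_in = (rng.random((T, N_IN)) < 0.2).astype(float)
raster = run_reservoir(spikes_in)
liquid_state = raster.mean(axis=0)      # time-averaged rates a readout could use

# 8-bit quantization of the static reservoir weights, shrinking their memory
# footprint roughly 8x relative to float64 (one possible scheme, assumed here).
scale = np.abs(W_rec).max() / 127.0
W_q = np.round(W_rec / scale).astype(np.int8)
```

In the setting the abstract describes, a trainable readout would map states like `liquid_state` to class labels, while `W_in` and `W_rec` stay fixed; the quantized `W_q` (with its `scale`) could replace `W_rec` at inference with only a small accuracy cost.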
