Article

Liquid State Machine on SpiNNaker for Spatio-Temporal Classification Tasks

Journal

FRONTIERS IN NEUROSCIENCE
Volume 16

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fnins.2022.819063

Keywords

Liquid State Machine; N-MNIST; neuromorphic hardware; spiking neural network; SpiNNaker

Funding

  1. EU [871371]
  2. Ministry of Science and Innovation [PID2019-105556GB-C31]
  3. European Regional Development Fund
  4. CONACYT [688116/578600]


Abstract
Liquid State Machines (LSMs) are computing reservoirs composed of recurrently connected Spiking Neural Networks (SNNs). They have attracted research interest both for their capacity to model biological structures and as promising pattern-recognition tools well suited to implementation in neuromorphic processors, benefiting from the modest computing resources required by their training process. However, LSMs have proven difficult to optimize for complex tasks such as event-based computer vision, and few implementations on large-scale neuromorphic processors have been attempted. In this work, we show that offline-trained LSMs implemented on the SpiNNaker neuromorphic processor are able to classify visual events, achieving state-of-the-art performance on the event-based N-MNIST dataset. The readout layer is trained using a recent adaptation of back-propagation-through-time (BPTT) for SNNs, while the internal weights of the reservoir are kept static. Results show that mapping our LSM from a Deep Learning framework to SpiNNaker does not affect the performance of the classification task. Additionally, we show that weight quantization, which substantially reduces the memory footprint of the LSM, has only a small impact on its performance.
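The weight-quantization result mentioned in the abstract can be illustrated with a minimal sketch. The function below applies uniform symmetric quantization to a small synthetic set of reservoir weights; this is a common scheme chosen here for illustration, and both the weight values and the exact quantization method used in the paper are assumptions, not reproductions of the authors' implementation.

```python
def quantize(weights, bits=8):
    """Uniformly quantize a list of weights to a signed `bits`-bit grid.

    Illustrative sketch only: the paper reports that quantizing the LSM's
    weights barely affects classification accuracy, but its exact
    quantization scheme is not reproduced here.
    """
    qmax = 2 ** (bits - 1) - 1               # e.g. 127 for 8-bit signed weights
    scale = max(abs(w) for w in weights) / qmax
    if scale == 0:                           # all-zero weights: nothing to quantize
        return list(weights)
    # Snap each weight to the nearest integer level, then map back to floats.
    return [round(w / scale) * scale for w in weights]


# A few synthetic reservoir weights (made up for the example).
weights = [0.73, -0.21, 0.05, -0.94, 0.38]
quantized = quantize(weights, bits=8)
# Worst-case error is bounded by half a quantization step (scale / 2).
max_err = max(abs(w - q) for w, q in zip(weights, quantized))
```

Storing 8-bit integers instead of 32-bit floats cuts the weight memory roughly fourfold, which is why quantization shrinks the LSM's footprint on memory-constrained hardware such as SpiNNaker while, per the abstract, costing little accuracy.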
