Journal
FRONTIERS IN NEUROSCIENCE
Volume 16
Publisher: FRONTIERS MEDIA SA
DOI: 10.3389/fnins.2022.819063
Keywords
Liquid State Machine; N-MNIST; neuromorphic hardware; spiking neural network; SpiNNaker
Funding
- EU [871371]
- Ministry of Science and Innovation [PID2019-105556GB-C31]
- European Regional Development Fund
- CONACYT [688116/578600]
Abstract
Liquid State Machines (LSMs) are computing reservoirs composed of recurrently connected Spiking Neural Networks (SNNs). They have attracted research interest both for their capacity to model biological structures and as promising pattern-recognition tools suited to implementation in neuromorphic processors, benefiting from the modest computing resources required for their training. However, optimizing LSMs for complex tasks such as event-based computer vision has proven difficult, and few implementations on large-scale neuromorphic processors have been attempted. In this work, we show that offline-trained LSMs implemented on the SpiNNaker neuromorphic processor can classify visual events, achieving state-of-the-art performance on the event-based N-MNIST dataset. The readout layer is trained using a recent adaptation of back-propagation-through-time (BPTT) for SNNs, while the internal weights of the reservoir are kept static. Results show that mapping our LSM from a Deep Learning framework to SpiNNaker does not affect the performance of the classification task. Additionally, we show that weight quantization, which substantially reduces the memory footprint of the LSM, has only a small impact on its performance.
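The abstract states that quantizing the LSM's weights substantially reduces its memory footprint with little loss in performance. As a rough illustration of the general idea (this is our own sketch, not the paper's code; the function names and the uniform per-tensor scheme are assumptions), trained readout weights can be mapped to signed fixed-point integers plus a single scale factor:

```python
import numpy as np

def quantize_weights(w, n_bits=8):
    """Uniformly quantize float weights to signed n-bit integers plus a scale.

    Hypothetical helper for illustration; the paper's actual quantization
    scheme may differ (e.g. per-connection or hardware-specific formats).
    """
    q_max = 2 ** (n_bits - 1) - 1                 # e.g. 127 for 8 bits
    scale = np.max(np.abs(w)) / q_max             # one scale per tensor
    w_q = np.round(w / scale).astype(np.int8 if n_bits <= 8 else np.int16)
    return w_q, scale

def dequantize_weights(w_q, scale):
    """Recover approximate float weights from the integer representation."""
    return w_q.astype(np.float64) * scale

# Toy readout weight matrix: 100 reservoir neurons -> 10 output classes.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(100, 10))

w_q, scale = quantize_weights(w, n_bits=8)
w_hat = dequantize_weights(w_q, scale)

# Uniform rounding bounds the per-weight error by half a quantization step,
# while storage drops from 64-bit floats to 8-bit integers (plus one scale).
err = np.max(np.abs(w - w_hat))
assert err <= scale / 2 + 1e-12
```

With 8 bits the integer matrix uses one eighth of the memory of the float64 original, which is the kind of saving that matters on memory-constrained neuromorphic hardware such as SpiNNaker's per-core SRAM.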