Journal
FRONTIERS IN NEUROSCIENCE
Volume 14, Issue -, Pages -
Publisher
FRONTIERS MEDIA SA
DOI: 10.3389/fnins.2020.00662
Keywords
neuromorphic computing; spiking networks; loss function; synaptic operations; energy consumption; convolutional networks; CIFAR10; MNIST-DVS
Funding
- H2020 ECSEL grant TEMPO [826655]
In the last few years, spiking neural networks (SNNs) have been demonstrated to perform on par with regular convolutional neural networks. Several works have proposed methods to convert a pre-trained CNN to a spiking CNN without a significant sacrifice of performance. We demonstrate first that quantization-aware training of CNNs leads to better accuracy in SNNs. One of the benefits of converting CNNs to spiking CNNs is to leverage the sparse computation of SNNs and consequently perform equivalent computation at a lower energy consumption. Here we propose an optimization strategy to train efficient spiking networks with lower energy consumption, while maintaining similar accuracy levels. We demonstrate results on the MNIST-DVS and CIFAR-10 datasets.
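The abstract describes training spiking networks with an objective that also targets energy consumption, expressed in the keywords as a loss function over synaptic operations. As a rough illustration of how such a penalty could be combined with a standard classification objective, below is a minimal PyTorch-style sketch that sums ReLU activations as a proxy for synaptic operations and adds that sum, scaled by a trade-off weight, to the cross-entropy loss. The toy network, the `lambda_energy` weight, and the activation-sum proxy are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch: penalizing an activation-based proxy for synaptic operations
# during CNN training. Network, proxy, and lambda_energy are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallConvNet(nn.Module):
    """Toy CNN whose ReLU activations are summed as an energy proxy."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.fc = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        act_sum = x.new_zeros(())          # running total of activations
        x = F.relu(self.conv1(x))
        act_sum = act_sum + x.sum()
        x = F.max_pool2d(x, 2)
        x = F.relu(self.conv2(x))
        act_sum = act_sum + x.sum()
        x = F.max_pool2d(x, 2)
        logits = self.fc(x.flatten(1))
        return logits, act_sum

def energy_aware_loss(logits, targets, act_sum, lambda_energy=1e-6):
    """Classification loss plus a penalty on total activation, used here as a
    stand-in for the synaptic operations a converted spiking network would perform."""
    return F.cross_entropy(logits, targets) + lambda_energy * act_sum

# Usage on one random CIFAR-10-sized batch (3x32x32 inputs).
model = SmallConvNet()
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
logits, act_sum = model(images)
loss = energy_aware_loss(logits, labels, act_sum)
loss.backward()
```

Raising `lambda_energy` trades classification accuracy for sparser activations, which is the kind of accuracy/energy trade-off the abstract refers to.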