Journal
IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS
Volume 42, Issue 8, Pages 2604-2617
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCAD.2022.3228896
Keywords
Circuit design; image classification; memristor; selective attention; sequence learning; spiking neural network (SNN); supervised algorithm
In this article, a selective supervised algorithm inspired by the selective attention mechanism is proposed, and memristive neural circuits are designed based on this algorithm. The proposed algorithm shows excellent performance on sequence learning. Additionally, attention encoding circuits are designed to encode external stimuli into attention spikes. The memristive spiking neural network circuit can achieve high accuracy on the MNIST and Fashion-MNIST datasets after learning a small number of labeled samples, reducing manual annotation cost and improving supervised learning efficiency.
Abstract
Spiking neural networks (SNNs) are biologically plausible and computationally powerful. Computing systems based on the von Neumann architecture remain the primary hardware basis for implementing SNNs. However, performance bottlenecks in computing speed, cost, and energy consumption hinder the hardware development of SNNs. Therefore, efficient non-von Neumann hardware computing systems for SNNs remain to be explored. In this article, a selective supervised algorithm for spiking neurons (SNs) inspired by the selective attention mechanism is proposed, and a memristive SN circuit as well as a memristive SNN circuit based on the proposed algorithm are designed. The memristor realizes the learning and memory of the synaptic weights. The proposed algorithm includes a top-down (TD) selective supervision method and a bottom-up (BU) selective supervision method. Compared with other supervised algorithms, the proposed algorithm shows excellent performance on sequence learning. Moreover, TD and BU attention encoding circuits are designed to provide the hardware foundation for encoding external stimuli into TD and BU attention spikes, respectively. The proposed memristive SNN circuit achieves superior classification accuracy on the MNIST and Fashion-MNIST datasets after learning a small number of labeled samples, which greatly reduces the cost of manual annotation and improves the supervised learning efficiency of the memristive SNN circuit.
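To make the abstract's setting concrete, the sketch below shows a generic leaky integrate-and-fire (LIF) spiking neuron with a toy supervised weight update that nudges the output spike count toward a target. This is only an illustration of supervised learning on a spiking neuron under assumed parameters; it is not the paper's selective supervised algorithm, its TD/BU attention encoding, or a memristor circuit model.

```python
# Generic LIF neuron with a crude supervised spike-count rule.
# All constants (v_th, tau, lr, input rate) are illustrative assumptions,
# not values from the paper.

import numpy as np

def lif_run(weights, spikes_in, v_th=1.0, tau=20.0, dt=1.0):
    """Simulate one LIF neuron over time; return its binary spike train."""
    v = 0.0
    decay = np.exp(-dt / tau)  # membrane-potential leak per step
    out = np.zeros(spikes_in.shape[1], dtype=int)
    for t in range(spikes_in.shape[1]):
        v = v * decay + weights @ spikes_in[:, t]
        if v >= v_th:
            out[t] = 1
            v = 0.0  # reset after firing
    return out

def supervised_step(weights, spikes_in, target, lr=0.05):
    """Perceptron-style update: potentiate or depress each synapse in
    proportion to its input activity and the spike-count error."""
    out = lif_run(weights, spikes_in)
    err = target - out.sum()
    weights = weights + lr * err * spikes_in.mean(axis=1)
    return weights, out

rng = np.random.default_rng(0)
spikes_in = (rng.random((10, 100)) < 0.2).astype(float)  # 10 inputs, 100 steps
w = rng.random(10) * 0.1
for _ in range(50):
    w, out = supervised_step(w, spikes_in, target=5)
print("spike count after training:", out.sum())
```

In a memristive implementation such as the one the article describes, the weight array `w` would instead be stored as memristor conductances, with the update applied by programming pulses rather than an in-memory subtraction.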