A Memristive Spiking Neural Network Circuit with Selective Supervised Attention Algorithm
Author
Deng, Zekun
Wang, Chunhua
Lin, Hairong
Sun, Yichuang
Abstract
Spiking neural networks (SNNs) are biologically plausible and computationally powerful. Computing systems based on the von Neumann architecture currently serve as nearly the only hardware basis for implementing SNNs. However, their bottlenecks in computing speed, cost, and energy consumption hinder the hardware development of SNNs, so efficient non-von Neumann hardware computing systems for SNNs remain to be explored. In this article, a selective supervised algorithm for spiking neurons (SNs) inspired by the selective attention mechanism is proposed, and a memristive SN circuit as well as a memristive SNN circuit based on the proposed algorithm are designed. The memristor realizes the learning and memory of the synaptic weight. The proposed algorithm comprises a top-down (TD) selective supervision method and a bottom-up (BU) selective supervision method, and compared with other supervised algorithms it shows excellent performance in sequence learning. Moreover, TD and BU attention encoding circuits are designed to provide the hardware foundation for encoding external stimuli into TD and BU attention spikes, respectively. The proposed memristive SNN circuit can classify the MNIST and Fashion-MNIST datasets with superior accuracy after learning only a small number of labeled samples, which greatly reduces the cost of manual annotation and improves the supervised learning efficiency of the memristive SNN circuit.
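
The abstract does not give implementation details of the selective supervision rule. As a purely illustrative sketch, and not the authors' algorithm, the following Python snippet shows one generic way a supervised weight update for a leaky integrate-and-fire neuron could be gated by a top-down attention signal, with weights bounded like memristor conductances; all names, constants, and the update rule itself are hypothetical.

# Illustrative sketch only -- NOT the algorithm from the article.
# A supervised, attention-gated weight update for a single LIF neuron.
import numpy as np

rng = np.random.default_rng(0)

T = 100            # number of time steps
n_inputs = 20      # number of presynaptic inputs
tau = 10.0         # membrane time constant (arbitrary units)
v_th = 1.0         # firing threshold
lr = 0.01          # learning rate

weights = rng.uniform(0.0, 0.5, n_inputs)                  # synaptic weights (memristor-like states)
inputs = (rng.random((T, n_inputs)) < 0.2).astype(float)   # Poisson-like input spike trains
target = (rng.random(T) < 0.1).astype(float)               # desired output spike train (supervision)
attention = (np.arange(T) % 20) < 10                       # hypothetical TD attention window

v = 0.0
for t in range(T):
    # Leaky integrate-and-fire membrane update
    v += (-v / tau) + inputs[t] @ weights
    spike = float(v >= v_th)
    if spike:
        v = 0.0
    # Supervised update pushes the output toward the target spike train,
    # but only while the top-down attention signal is active.
    if attention[t]:
        weights += lr * (target[t] - spike) * inputs[t]
        weights = np.clip(weights, 0.0, 1.0)               # bounded like a memristor conductance

print("final weights:", np.round(weights, 3))

In the article's circuit the analogous state would be stored and updated in memristor conductances rather than in software variables; the sketch only conveys the idea of gating supervised learning with an attention signal.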