Spiking neural networks (SNNs) are inspired by biological brains and have demonstrated great energy efficiency on hardware computing platforms. However, it is challenging to implement an online training algorithm on SNN hardware that can adapt to realistic cognitive applications. Moreover, the accuracy of SNNs adopting a rate-based coding scheme is highly dependent on the number of coding timesteps, which leads to long data processing times and high power consumption. In this brief, a multilayer online-training SNN accelerator based on the spike-timing-dependent-plasticity-based back-propagation (BP-STDP) algorithm is presented. To speed up spike processing, a weighted neuron model is proposed according to the characteristics of the BP-STDP training algorithm. In addition, the training module reuses the integrate-and-fire arithmetic unit circuits of the hidden-layer neuron module, which significantly reduces the hardware overhead. The proposed accelerator is verified on a Xilinx Virtex-7 FPGA at a 100 MHz clock frequency for the application of handwritten digit recognition. It achieves a 95.3% recognition accuracy while taking 0.27 ms to recognize one image and 1.22 ms to train on one image. Compared with a software implementation, the proposed accelerator demonstrates a $41.87\times $ training speedup and a $56.96\times $ recognition speedup.
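For intuition only, the following is a minimal Python sketch of an integrate-and-fire layer trained with a BP-STDP-style update on rate-coded inputs. The class, function, and parameter names (`IFLayer`, `bp_stdp_update`, `threshold`, `lr`) and all constants are illustrative assumptions; they do not represent the paper's weighted neuron model, fixed-point arithmetic, or FPGA implementation.

```python
import numpy as np

# Minimal integrate-and-fire (IF) layer: accumulates weighted input spikes
# and emits an output spike when the membrane potential crosses a threshold.
# All names and constants are illustrative, not taken from the paper.
class IFLayer:
    def __init__(self, n_in, n_out, threshold=1.0, rng=None):
        rng = rng or np.random.default_rng(0)
        self.w = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.threshold = threshold
        self.v = np.zeros(n_out)

    def step(self, in_spikes):
        # Integrate weighted presynaptic spikes, fire, then reset.
        self.v += in_spikes @ self.w
        out_spikes = (self.v >= self.threshold).astype(float)
        self.v[out_spikes == 1.0] = 0.0
        return out_spikes


def bp_stdp_update(layer, pre_spikes, post_error, lr=0.01):
    """Simplified BP-STDP-style rule: a signed error at each postsynaptic
    neuron gates an STDP-like change on synapses that received a
    presynaptic spike in the current timestep."""
    layer.w += lr * np.outer(pre_spikes, post_error)


# Toy usage: rate-coded input over T timesteps, desired spike count per output.
T, n_in, n_out = 20, 100, 10
rng = np.random.default_rng(1)
layer = IFLayer(n_in, n_out)
rates = rng.uniform(0.0, 0.5, size=n_in)                  # input firing probabilities
target_counts = np.zeros(n_out); target_counts[3] = T * 0.5  # desired label activity

for t in range(T):
    pre = (rng.random(n_in) < rates).astype(float)  # Poisson-like rate coding
    post = layer.step(pre)
    err = (target_counts / T) - post                # per-timestep error approximation
    bp_stdp_update(layer, pre, err)
```

The sketch illustrates why the paper's hardware reuse is natural: the training update consumes the same presynaptic spikes and postsynaptic activity that the integrate-and-fire forward pass already produces, so the accumulation circuitry can be shared between inference and training.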
               