In this paper, we present a novel early-termination-based training acceleration technique for the design of temporal-coding-based spiking neural network (SNN) processors. The proposed early termination scheme efficiently identifies non-contributing training images during the feedforward phase of training and skips the remaining computation for those images, saving training energy and time. A metric that evaluates each input image's contribution to training is developed and compared against a pre-determined threshold to decide whether to skip the rest of the training process. For threshold selection, an adaptive threshold calculation method is presented that increases the computation skip ratio without sacrificing accuracy. A timestep splitting approach is also employed to allow more frequent early termination within the split timesteps, leading to further computation savings. The proposed early termination and timestep splitting techniques reduce synaptic operations by 51.21/42.31/93.53/30.36% and feedforward timesteps by 86.06/64.63/90.82/49.14% during training on the MNIST/Fashion-MNIST/ETH-80/EMNIST-Letters datasets, respectively. A hardware implementation of the proposed SNN processor in a 28 nm CMOS process achieves training energy savings of 61.76/31.88% and computation cycle reductions of 69.10/36.26% on the MNIST/Fashion-MNIST datasets, respectively.
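To make the control flow of the scheme concrete, the following is a minimal sketch of an early-termination training loop with timestep splitting and an adaptive threshold, written in Python/NumPy. The abstract does not specify the network model, the contribution metric, or the threshold update rule, so the placeholder dynamics, the margin-based metric, the exponential-moving-average threshold, and all names here (forward_split, contribution_metric, and so on) are hypothetical stand-ins, not the paper's actual method.

```python
import numpy as np

# Hypothetical parameters (the abstract gives no concrete values).
T_TOTAL = 64          # total feedforward timesteps per image (assumed)
N_SPLITS = 4          # timestep-splitting factor (assumed)
EMA_DECAY = 0.95      # decay for the adaptive-threshold statistic (assumed)

rng = np.random.default_rng(42)

def forward_split(x, w, potentials, n_steps):
    """Advance toy membrane dynamics by n_steps timesteps; stands in
    for one split of the real temporal-coding SNN feedforward pass."""
    for _ in range(n_steps):
        potentials = 0.9 * potentials + x @ w
    return potentials

def contribution_metric(potentials, label):
    """Placeholder contribution score: confidence margin of the target
    class. A large margin suggests the image would barely change the
    weights, so the rest of its training can be skipped."""
    return potentials[label] - np.max(np.delete(potentials, label))

def train_with_early_termination(images, labels, w):
    threshold = 0.0                       # adaptive threshold, updated online
    steps_per_split = T_TOTAL // N_SPLITS
    skipped = 0
    for x, y in zip(images, labels):
        potentials = np.zeros(w.shape[1])
        terminated = False
        for _ in range(N_SPLITS):
            potentials = forward_split(x, w, potentials, steps_per_split)
            # Check at every split boundary so a non-contributing image
            # is dropped as early as possible within the timestep window.
            if contribution_metric(potentials, y) > threshold:
                terminated = True
                skipped += 1
                break
        # Adaptive threshold: track a running statistic of recent metrics
        # to raise the skip ratio without hurting accuracy (assumed rule).
        threshold = (EMA_DECAY * threshold
                     + (1 - EMA_DECAY) * contribution_metric(potentials, y))
        if not terminated:
            w -= 1e-3 * np.outer(x, potentials)   # stand-in weight update
    return w, skipped

# Tiny smoke test on random data.
X = rng.standard_normal((100, 16))
Y = rng.integers(0, 10, size=100)
W = rng.standard_normal((16, 10)) * 0.1
W, n_skipped = train_with_early_termination(X, Y, W)
print(f"early-terminated {n_skipped}/100 images")
```

The key design point the sketch illustrates is that the metric is evaluated at every split boundary rather than only after the full timestep window; this is what lets timestep splitting terminate non-contributing images sooner and thereby increase the overall computation skip ratio.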