This paper presents a novel spiking neuromorphic architecture based on charge-trap transistors (CTTs), experimentally verified compute-in-memory devices. The proposed low-power, scalable architecture targets large neural network applications, such as machine learning tasks and the emulation of brain connectivity networks. Data within the architecture are encoded using a number-of-spikes approach. The CTT-based synapses receive Gaussian spikes (described as the most energy-efficient waveform for communication) as inputs from other neurons; the spikes are multiplied by synaptic weights and accumulated. The neuron, designed using a leaky integrate-and-fire model, generates a similar spike at its output. Compared with architectures reported in the literature, the proposed design exhibits superior parameters: the neuron (including its synaptic array) occupies an area of $178.25~\mu \text{m}^{2}$, supporting 5.6k neurons and 560k synapses per mm², while exhibiting a low energy per synaptic operation of 8 pJ. To validate the proposed architecture, a single neuron was designed and evaluated as a binary classifier for two digits from the MNIST data set. The accuracy, recall, and precision of the hardware neuron on this binary classification task are, respectively, 99.2%, 99.5%, and 98.6% (comparable to results from other reported works).
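The weighted-accumulation and leaky integrate-and-fire behavior summarized above can be illustrated with a minimal software sketch. This is a generic discrete-time LIF model, not the paper's CTT circuit: the threshold `v_th`, leak factor `leak`, and reset-to-zero rule are illustrative assumptions, and the synaptic weight vector stands in for the role the CTT array plays in hardware.

```python
import numpy as np

def lif_neuron(input_spikes, weights, v_th=1.0, leak=0.95):
    """Simulate a discrete-time leaky integrate-and-fire neuron.

    input_spikes: (T, N) binary array of spike trains from N presynaptic neurons
    weights:      (N,)   synaptic weights (illustrative stand-in for the CTT array)
    v_th, leak:   assumed threshold and per-step leak factor (not from the paper)

    Returns the output spike train (T,) and the membrane-potential trace.
    """
    T = input_spikes.shape[0]
    v = 0.0
    out = np.zeros(T, dtype=int)
    trace = np.zeros(T)
    for t in range(T):
        # Leak the membrane potential, then accumulate weighted input spikes.
        v = leak * v + input_spikes[t] @ weights
        if v >= v_th:
            out[t] = 1   # fire an output spike
            v = 0.0      # reset after firing
        trace[t] = v
    return out, trace

# Strong input (summed weight 1.2 per step) crosses threshold every step;
# weak input (summed weight 0.04, steady state 0.8) never fires.
strong, _ = lif_neuron(np.ones((20, 4), dtype=int), np.full(4, 0.3))
weak, _ = lif_neuron(np.ones((20, 4), dtype=int), np.full(4, 0.01))
```

Under this model, the neuron's firing count over a window reflects the strength of its weighted input, which is the basis of the number-of-spikes encoding the abstract describes.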