The conventional computing system based on the von Neumann architecture has greatly benefited humanity over the past decades, yet it suffers from low efficiency due to the separation between the memory block and the processing unit. The memristor, an emerging electronic device capable of storing and processing data simultaneously, can be employed to construct bio-inspired neuromorphic computing systems. Simulation, as one of the most powerful methods for optimizing memristor-based neuromorphic networks, has been extensively studied to realize high-precision computation. This remains difficult, however, because the pulse-to-pulse (P2P) model is limited by the weight-updating process, and online training of memristor-based multi-layer perceptron (MLP) networks generally yields low accuracy. An efficient training schedule is therefore urgently desired to improve the accuracy. Based on the resistive switching behavior observed in the Ag/TiOx/F-doped SnO2 memristor, weight updating with the P2P model enables high-performance online training of the MLP network on low-precision memristors. A low-bit MLP optimized by a novel weight-update schedule can achieve high-precision recognition and classification, which largely reduces the time and power consumption of the memristor. The experimental results show that accuracies of 90.82% and 95.44% are obtained at the first and final epochs on the MNIST handwritten digit dataset, respectively. Importantly, the number of weight updates, together with the online training time and power consumption, can be reduced by 81% and 93.7%, respectively. The scheme provides a high-precision, low-power, and fast-convergence solution for the in-situ training of imprecise memristor-based neuromorphic networks.
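The general idea of a P2P weight update that skips small updates (and thereby reduces programming pulses, time, and power) can be sketched as follows. This is a minimal illustration under assumed parameters; the function name, conductance range, level count, and threshold rule are all hypothetical and are not taken from the paper.

```python
import numpy as np

def p2p_update(weights, gradients, lr=0.1, n_levels=32, threshold=None):
    """Hypothetical sketch: quantize weight updates to discrete device
    conductance levels and skip updates smaller than a threshold, so that
    fewer programming pulses are issued. All parameters are illustrative."""
    step = 2.0 / n_levels               # conductance step for weights in [-1, 1]
    if threshold is None:
        threshold = step / 2            # skip sub-level updates by default
    delta = -lr * gradients             # ideal gradient-descent update
    mask = np.abs(delta) >= threshold   # only program cells that need a pulse
    n_pulses = np.round(delta / step)   # integer SET/RESET pulses per cell
    new_w = weights + np.where(mask, n_pulses * step, 0.0)
    new_w = np.clip(new_w, -1.0, 1.0)   # limited device conductance range
    return new_w, int(mask.sum())       # updated weights, pulses issued

# Example: only the two large-gradient cells are programmed.
w = np.zeros(4)
g = np.array([0.9, -0.9, 0.001, 0.0])
new_w, n_updates = p2p_update(w, g)     # n_updates == 2
```

In this toy example the two near-zero gradients fall below the threshold, so no pulses are applied to those cells; reducing the update count in this way is the mechanism by which the paper's schedule cuts training time and power.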