As an emerging in-memory computing element, the memristor has been widely used in neural network circuits to represent weights and accelerate computation. However, the Transformer Network (TN), one of the most important models for machine vision and natural language processing in recent years, has not yet been implemented as a full memristive circuit, owing to its complex computation flow and data-storage requirements. To carry out TN computation more efficiently, this work proposes a memristor-based full-circuit implementation of the TN consisting of: 1) a memristor crossbar module that stores the TN weights and performs the vector-matrix multiplications; 2) an analog signal memory module that stores analog signals directly in near-memory mode; 3) function circuit modules that realize five operations, namely Softmax, Layer Normalization, ReLU, multiply-add, and residual connection; 4) a timing signal generation module that schedules the operations of the circuit. The proposed TN circuit completes all calculations directly on analog signals, without any analog-to-digital converter (ADC), digital-to-analog converter (DAC), or digital memory. In addition, character image recognition experiments are carried out in PSPICE to verify the functional correctness of the designed circuit; the signal retention rate of the analog memory, the performance of the whole circuit, and the non-idealities of the memristors are also analyzed. The results indicate that the circuit has advantages in area overhead, energy efficiency, and noise immunity.
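To make the crossbar's role concrete, the sketch below models the basic physics the abstract relies on: input voltages drive the crossbar rows, each memristor's conductance encodes a weight, and Kirchhoff's current law sums per-column currents so that the output current vector equals the vector-matrix product. This is a minimal behavioral model, not the authors' circuit; the conductance range `G_MIN`/`G_MAX` and the differential positive/negative cell-pair mapping for signed weights are illustrative assumptions, though both are common in memristive crossbar designs.

```python
# Behavioral sketch of crossbar vector-matrix multiplication (assumed scheme,
# not the paper's exact circuit): I = G^T v by Ohm's law + Kirchhoff's current law.
import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4  # assumed memristor conductance range, in siemens

def weights_to_conductances(W):
    """Map a signed weight matrix onto two non-negative conductance matrices
    (a common differential-pair scheme): W is proportional to G_pos - G_neg."""
    scale = (G_MAX - G_MIN) / np.abs(W).max()
    G_pos = G_MIN + scale * np.clip(W, 0, None)   # positive parts of W
    G_neg = G_MIN + scale * np.clip(-W, 0, None)  # magnitudes of negative parts
    return G_pos, G_neg, scale

def crossbar_vmm(v, G_pos, G_neg, scale):
    """Column currents via Ohm's law summed by KCL; the differential readout
    recovers the signed product W^T v (up to the mapping scale)."""
    i_pos = G_pos.T @ v   # currents collected on the 'positive' columns
    i_neg = G_neg.T @ v   # currents collected on the 'negative' columns
    return (i_pos - i_neg) / scale

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # weights held in the crossbar
v = rng.standard_normal(4)        # input voltage vector on the rows
G_pos, G_neg, scale = weights_to_conductances(W)
print(np.allclose(crossbar_vmm(v, G_pos, G_neg, scale), W.T @ v))  # True
```

Because the multiplication happens in the analog current domain, the result can be handed directly to the analog memory and function circuit modules described above, which is what allows the design to avoid ADCs, DACs, and digital memory entirely.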