The softmax function is widely used in deep neural networks (DNNs), and its hardware performance plays an important role in the training and inference of DNN accelerators. However, due to the complexity of the traditional softmax, existing hardware architectures are either resource-intensive or low-precision. To address these challenges, we study a base-2 softmax function in terms of its suitability for neural network training and efficient hardware implementation. Compared to the classical base-
               
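The relationship between the base-2 variant and the classical softmax can be sketched as follows. This is a minimal illustration, not the paper's hardware implementation: it assumes only the mathematical identity 2^x = e^(x ln 2), which means the base-2 softmax equals the classical softmax applied to logits scaled by ln 2.

```python
import numpy as np

def softmax(x):
    # Classical softmax: normalized e^x.
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

def softmax_base2(x):
    # Base-2 variant: replaces e^x with 2^x. In hardware, 2^x is
    # attractive because it splits into an integer part (a bit shift)
    # and a small fractional part, avoiding a full exponential unit.
    p = np.exp2(x - np.max(x))
    return p / p.sum()

x = np.array([1.0, 2.0, 3.0])

# Identity check: base-2 softmax on x equals classical softmax on x * ln 2,
# since 2^x = e^(x * ln 2).
assert np.allclose(softmax_base2(x), softmax(x * np.log(2)))
```

Because the two differ only by a constant scaling of the logits, a network can in principle train with the base-2 form directly, which is the suitability question the abstract raises.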