
Neural network pruning based on channel attention mechanism

Network pruning facilitates the deployment of convolutional neural networks in resource-limited environments by reducing redundant parameters. However, most existing methods ignore the differences in the contributions of the output feature maps. In response, we propose a novel neural network pruning method based on the channel attention mechanism. In this paper, we first utilise the principal component analysis algorithm to reduce the influence of noisy data on the feature maps. We then propose an improved Leaky-Squeeze-and-Excitation block that evaluates the contribution of each output feature map via the channel attention mechanism. Finally, we remove low-contribution channels while minimising the loss in model performance. Extensive experimental results show that the proposed method achieves significant reductions in FLOPs and parameters compared with state-of-the-art methods at similar accuracy. For example, with the VGG-16 baseline, the proposed method reduces parameters by 83.3% and FLOPs by 66.3% with only a 0.13% loss in top-5 accuracy. Furthermore, it effectively balances pruning efficiency and prediction accuracy.
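As a rough illustration of the channel-scoring step described in the abstract, the following PyTorch sketch shows a Squeeze-and-Excitation-style block with a LeakyReLU activation whose per-channel attention weights are reused as pruning scores. The block structure, reduction ratio, and the prune_ratio parameter are assumptions for illustration only; the paper's exact Leaky-Squeeze-and-Excitation design and PCA-based denoising step are not reproduced here.

```python
import torch
import torch.nn as nn

class LeakySEBlock(nn.Module):
    """SE-style channel attention with LeakyReLU (assumed variant);
    the excitation weights double as per-channel contribution scores."""
    def __init__(self, channels, reduction=16, negative_slope=0.01):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)            # squeeze: global average pool
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.LeakyReLU(negative_slope, inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c))           # per-channel attention weights
        return x * w.view(b, c, 1, 1), w               # recalibrated maps + scores


def channels_to_prune(scores, prune_ratio=0.3):
    """Average the attention scores over a batch and mark the
    lowest-scoring channels for removal (prune_ratio is hypothetical)."""
    mean_scores = scores.mean(dim=0)                   # (C,) mean contribution per channel
    k = int(prune_ratio * mean_scores.numel())
    return torch.topk(mean_scores, k, largest=False).indices


# Toy usage: score the 64 output channels of one convolutional layer.
feats = torch.randn(8, 64, 32, 32)                     # batch of output feature maps
block = LeakySEBlock(64)
_, scores = block(feats)
print(channels_to_prune(scores, prune_ratio=0.25))     # indices of candidate channels
```

In practice the selected channels (and the corresponding filters in the preceding convolution) would be physically removed and the network fine-tuned, which is how the FLOPs and parameter reductions quoted above are obtained.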

Keywords: network pruning; attention mechanism; channel attention

Journal Title: Connection Science
Year Published: 2022


