
Compression of Convolutional Neural Networks With Divergent Representation of Filters.

Convolutional neural networks (CNNs) have achieved remarkable results in many tasks. However, most of them can hardly be deployed on embedded systems directly because of their huge memory and computing requirements. In this article, we propose a pruning framework, namely, FiltDivNet, to accelerate and compress CNN models so that they can run on small or portable devices. The correlations among filters are taken into account and measured by goodness of fit. On this basis, a hybrid-cluster pruning strategy with dynamic pruning ratios for the different clusters in a CNN model is designed; it aims to preserve the diversity of the filter representations by removing redundant filters cluster by cluster. In addition, a new loss function with adaptive sparsity constraints is introduced for retraining and fine-tuning in FiltDivNet. Finally, comparative experiments on classical CNN models demonstrate its compression performance and its adaptability to different CNN architectures.
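To make the cluster-by-cluster idea concrete, here is a minimal sketch of generic cluster-based filter pruning for a single convolutional layer: filters are grouped by pairwise correlation and one representative per cluster is kept. This is an illustrative assumption-laden example, not the authors' FiltDivNet method; the goodness-of-fit measure, hybrid clustering, dynamic per-cluster pruning ratios, and adaptive sparsity loss described in the abstract are not reproduced here, and the function name and cluster count are hypothetical.

```python
# Sketch only: cluster the filters of one conv layer by correlation and keep a
# single representative per cluster. Not the FiltDivNet algorithm itself.
import torch
import torch.nn as nn
from scipy.cluster.hierarchy import linkage, fcluster


def prune_conv_filters(conv: nn.Conv2d, n_clusters: int) -> nn.Conv2d:
    w = conv.weight.detach()            # shape: (out_ch, in_ch, kH, kW)
    flat = w.flatten(1)                 # one row per filter

    # Distance between filters: 1 - |correlation| of their flattened weights.
    corr = torch.corrcoef(flat)
    dist = (1.0 - corr.abs()).clamp_min(0.0)

    # Agglomerative clustering on the condensed (upper-triangular) distances.
    iu = torch.triu_indices(dist.size(0), dist.size(1), offset=1)
    condensed = dist[iu[0], iu[1]].cpu().numpy()
    labels = fcluster(linkage(condensed, method="average"),
                      t=n_clusters, criterion="maxclust")

    # Within each cluster, keep the filter with the largest L1 norm.
    norms = flat.abs().sum(dim=1)
    keep = sorted(
        max((i for i in range(len(labels)) if labels[i] == c),
            key=lambda i: norms[i].item())
        for c in set(labels)
    )

    # Build a smaller layer holding only the surviving filters.
    pruned = nn.Conv2d(conv.in_channels, len(keep), conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = w[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.detach()[keep].clone()
    return pruned


# Example: prune a 64-filter layer down to 16 representative filters.
layer = nn.Conv2d(3, 64, kernel_size=3, padding=1)
print(prune_conv_filters(layer, n_clusters=16))
```

In a full pruning pipeline, the following layer's input channels would also have to be sliced to match the kept filters, and the network would then be retrained or fine-tuned, which is where the abstract's adaptive sparsity loss would come in.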

Keywords: compression of convolutional neural networks; divergent representation of filters; CNN models

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2022
