A Mutual Learning Framework for Pruned and Quantized Networks

Model compression is an important topic in deep learning research. It can be divided into two main directions: model pruning and model quantization. However, both methods degrade the original accuracy of the model to some extent. In this paper, we propose a mutual learning framework for pruned and quantized networks. We regard the pruned network and the quantized network as two non-parallel sets of features. The purpose of our mutual learning framework is to better integrate the two sets of features and achieve complementary advantages, which we call feature augmentation. To verify the effectiveness of our framework, we evaluate pairwise combinations of three state-of-the-art pruning algorithms and three state-of-the-art quantization algorithms. Extensive experiments on CIFAR-10, CIFAR-100, and Tiny-ImageNet show the benefits of our framework: through mutual learning between the two networks, we simultaneously obtain a pruned network and a quantized network with higher accuracy.
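For readers unfamiliar with the general idea of mutual learning between two compressed networks, the sketch below illustrates one common formulation in PyTorch: each network is trained with its own task loss plus a KL-divergence term that pulls its predictions toward the other network's predictions. This is a generic deep-mutual-learning sketch, not the paper's exact objective; the `pruned_net`, `quantized_net`, the `alpha` weight, and the detach placement are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def mutual_learning_step(pruned_net, quantized_net, x, y, opt_p, opt_q, alpha=1.0):
    """One hypothetical mutual-learning update for a pruned and a quantized network.

    Each network minimizes its own cross-entropy loss plus a KL term toward
    the peer's (detached) predictions. The paper's actual loss may differ.
    """
    logits_p = pruned_net(x)
    logits_q = quantized_net(x)

    # Standard task losses on the ground-truth labels.
    ce_p = F.cross_entropy(logits_p, y)
    ce_q = F.cross_entropy(logits_q, y)

    # Mimicry losses: KL divergence toward the peer network's predictions.
    kl_p = F.kl_div(F.log_softmax(logits_p, dim=1),
                    F.softmax(logits_q.detach(), dim=1),
                    reduction="batchmean")
    kl_q = F.kl_div(F.log_softmax(logits_q, dim=1),
                    F.softmax(logits_p.detach(), dim=1),
                    reduction="batchmean")

    loss_p = ce_p + alpha * kl_p
    loss_q = ce_q + alpha * kl_q

    # Update both networks; each has its own optimizer.
    opt_p.zero_grad(); loss_p.backward(); opt_p.step()
    opt_q.zero_grad(); loss_q.backward(); opt_q.step()
    return loss_p.item(), loss_q.item()
```

In this generic setup, the two networks act as peers rather than as teacher and student, so the pruned and quantized models can each benefit from the other's complementary features during training.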

Keywords: mutual learning framework; pruned networks; quantized networks

Journal Title: Journal of Computer Science and Technology
Year Published: 2023
