
Deep network compression based on partial least squares



Abstract Modern visual pattern recognition methods are based on convolutional networks, since these networks can learn complex patterns directly from data. However, convolutional networks are computationally expensive in terms of floating-point operations (FLOPs), energy consumption, and memory requirements, which hinders their deployment on low-power and resource-constrained systems. To address this problem, many works have proposed pruning strategies, which remove neurons (i.e., filters) from convolutional networks to reduce their computational cost. Despite achieving remarkable results, existing pruning approaches can be ineffective because they degrade network accuracy. This loss in accuracy is an effect of the criterion used to remove filters, which may discard filters with a strong influence on the classification ability of the network. Motivated by this, we propose an approach that eliminates filters based on the relationship between their outputs and the class label, captured in a low-dimensional space. This relationship is modeled using Partial Least Squares (PLS), a discriminative feature projection method. Due to the nature of PLS, our method focuses on keeping discriminative filters. As a consequence, we are able to remove up to 60% of FLOPs while improving network accuracy. We show that our criterion is superior to existing pruning criteria, including state-of-the-art feature selection techniques and handcrafted approaches. Compared to state-of-the-art pruning strategies, our method achieves the best trade-off between drop/improvement in accuracy and FLOPs reduction.

Keywords: convolutional networks; partial least squares; accuracy; deep network compression

Journal Title: Neurocomputing
Year Published: 2020



