Logic Synthesis of Binarized Neural Networks for Efficient Circuit Implementation

Neural networks (NNs) are key to deep learning systems. Their efficient hardware implementation is crucial to applications at the edge. Binarized NNs (BNNs), where the weights and output of a neuron are binary values {−1,+1} (or encoded in {0, 1}), have been proposed. As no multiplier is required, BNNs are particularly attractive and well suited for hardware realization. Most prior NN synthesis methods target hardware architectures with neural processing elements (NPEs), where the weights of a neuron are loaded and its output is computed. This load-and-compute approach, though area efficient, requires expensive memory accesses, which degrade energy and performance efficiency. In this work we aim at synthesizing BNN layers into dedicated logic circuits. We formulate the corresponding model pruning and matrix covering problems to reduce the area and routing cost of BNNs. For model pruning, we propose and compare three strategies applied at the BNN training stage. For matrix covering, we propose a scalable logic-sharing algorithm. Combining these two methods, experimental results demonstrate the effectiveness of our approach in terms of area and net savings on FPGA implementations. Our method provides an alternative implementation of BNNs and can be applied in combination with NPE-based implementations for area, speed, and power tradeoffs.
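To make the multiplier-free property mentioned in the abstract concrete, below is a minimal Python sketch, not taken from the paper, of how a binarized neuron's dot product reduces to an XNOR followed by a popcount; the function name bnn_neuron and the bit-level encoding shown are illustrative assumptions.

    # A minimal sketch (not from the paper) of a multiplier-free binarized
    # neuron: with inputs and weights in {-1, +1} encoded as {0, 1} bits,
    # the dot product equals 2 * popcount(XNOR(inputs, weights)) - n.

    def bnn_neuron(inputs: int, weights: int, n: int, threshold: int = 0) -> int:
        """Binarized neuron over n-bit operands.

        `inputs` and `weights` are n-bit integers whose bits encode
        {-1 -> 0, +1 -> 1}. Returns the binarized activation (0 or 1).
        """
        mask = (1 << n) - 1
        xnor = ~(inputs ^ weights) & mask      # 1 where input and weight signs agree
        matches = bin(xnor).count("1")         # popcount
        dot = 2 * matches - n                  # dot product in the {-1, +1} domain
        return 1 if dot >= threshold else 0    # sign-style binarized activation

    # Example: inputs 0b1011 (+1,-1,+1,+1) against weights 0b1001 (+1,-1,-1,+1):
    # dot product = 1 + 1 - 1 + 1 = 2, so the neuron fires.
    print(bnn_neuron(0b1011, 0b1001, n=4))     # -> 1

In hardware this structure maps to one XNOR gate per weight, a popcount tree, and a comparator, which is the property that makes synthesizing BNN layers into dedicated logic circuits attractive.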

Keywords: implementation; logic synthesis; area; neural networks; binarized NN synthesis

Journal Title: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Year Published: 2022
