Articles with "bit width" as a keyword




Low Bit-Width Convolutional Neural Network on RRAM

Published in 2020 in "IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems"

DOI: 10.1109/tcad.2019.2917852

Abstract: The emerging resistive random-access memory (RRAM) has been widely applied in accelerating the computing of deep neural networks. However, it is challenging to achieve high-precision computations based on RRAM due to the limits of the…

Keywords: low bit; rram; width convolutional; bit width ...

A High Performance Multi-Bit-Width Booth Vector Systolic Accelerator for NAS Optimized Deep Learning Neural Networks

Published in 2022 in "IEEE Transactions on Circuits and Systems I: Regular Papers"

DOI: 10.1109/tcsi.2022.3178474

Abstract: Multi-bit-width convolutional neural networks (CNNs) maintain the balance between network accuracy and hardware efficiency, thus suggesting a promising method for accurate yet energy-efficient edge computing. In this work, we develop a state-of-the-art multi-bit-width accelerator for NAS…

Keywords: bit width; network; nas optimized; multi bit ...
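The Booth recoding at the heart of such a multiplier can be illustrated generically. The sketch below is not the paper's accelerator; it is a plain software model of radix-4 Booth multiplication (function name, 8-bit default, and structure are our assumptions): each overlapping 3-bit window of the multiplier is recoded to a digit in {-2, -1, 0, 1, 2}, halving the number of partial products versus bit-by-bit multiplication.

```python
def booth_radix4_multiply(x, y, bits=8):
    """Multiply two signed `bits`-bit integers via radix-4 Booth recoding.

    Window i covers bits (b_{2i+1}, b_{2i}, b_{2i-1}) of y and yields
    digit = b_{2i} + b_{2i-1} - 2*b_{2i+1}; the product is the sum of
    digit * x * 4**i partial products (illustrative model, not hardware).
    """
    assert bits % 2 == 0
    mask = (1 << bits) - 1
    u = y & mask                      # two's-complement bit pattern of y
    prod = 0
    prev = 0                          # implicit b_{-1} = 0
    for i in range(bits // 2):
        b_lo = (u >> (2 * i)) & 1
        b_hi = (u >> (2 * i + 1)) & 1
        digit = b_lo + prev - 2 * b_hi
        prod += digit * x * (4 ** i)
        prev = b_hi
    return prod
```

The recoding is exact for two's-complement operands, e.g. `booth_radix4_multiply(7, -5)` returns `-35`; only `bits // 2` partial products are summed, which is why Booth encoding is attractive for compact multiplier arrays.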

Parallel-Prefix Adder in Spin-Orbit Torque Magnetic RAM for High Bit-Width Non-Volatile Computation

Published in 2023 in "IEEE Transactions on Circuits and Systems II: Express Briefs"

DOI: 10.1109/tcsii.2022.3214504

Abstract: Recently, many computing-in-memory (CIM) systems based on non-volatile devices have been successfully implemented. However, they perform poorly in high bit-width processes due to device access latency and energy cost. In this brief, we present an…

Keywords: high bit; spin orbit; bit width; prefix adder ...
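The parallel-prefix idea behind such an adder can be sketched with a Kogge-Stone carry network, one common prefix topology (we are not claiming it is the one used in the brief). Generate/propagate pairs are combined over distances 1, 2, 4, ... so carries resolve in O(log2 bits) stages instead of a ripple chain, which is what makes high bit-width addition fast:

```python
def kogge_stone_add(a, b, bits=16):
    """Add two unsigned `bits`-bit integers with a Kogge-Stone
    parallel-prefix carry network (software model, names assumed)."""
    ai = [(a >> i) & 1 for i in range(bits)]
    bi = [(b >> i) & 1 for i in range(bits)]
    g = [x & y for x, y in zip(ai, bi)]          # generate
    p = [x ^ y for x, y in zip(ai, bi)]          # propagate
    G, P = g[:], p[:]
    d = 1
    while d < bits:                              # log2(bits) prefix stages
        G_new, P_new = G[:], P[:]
        for i in range(d, bits):
            G_new[i] = G[i] | (P[i] & G[i - d])  # prefix "o" operator
            P_new[i] = P[i] & P[i - d]
        G, P = G_new, P_new
        d <<= 1
    carry = [0] + G[:-1]                         # carry into bit i is G[i-1]
    s = 0
    for i in range(bits):
        s |= (p[i] ^ carry[i]) << i
    return s                                     # sum modulo 2**bits
```

For example, `kogge_stone_add(13, 7, bits=8)` returns `20`; the three prefix stages for 8 bits replace up to seven ripple-carry steps.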

Residual Quantization for Low Bit-Width Neural Networks

Published in 2023 in "IEEE Transactions on Multimedia"

DOI: 10.1109/tmm.2021.3124095

Abstract: Neural network quantization has been shown to be an effective way to compress and accelerate networks. However, existing binary or ternary quantization methods suffer from two major issues. First, low bit-width input/activation quantization easily results in…

Keywords: network; residual quantization; quantization; low bit ...
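The residual idea named in the title can be sketched generically: quantize the weights, then quantize the leftover error, and use the sum of the stages as the approximation. The toy NumPy version below assumes a sign/mean-scale binarizer as the per-stage quantizer (a common choice in binary-network work; the paper's actual scheme may differ, and all names here are ours):

```python
import numpy as np

def binarize(w):
    """One binary stage: scale * sign(w), with scale = mean |w|
    (assumed per-stage quantizer, not necessarily the paper's)."""
    scale = np.abs(w).mean()
    return scale * np.sign(w)

def residual_quantize(w, stages=2):
    """Greedy residual quantization: each stage binarizes what the
    previous stages failed to represent; the low bit-width
    approximation is the sum of all stage outputs."""
    residual = w.astype(float)
    approx = np.zeros_like(residual)
    for _ in range(stages):
        q = binarize(residual)
        approx += q
        residual = residual - q
    return approx
```

Each extra stage can only shrink the squared error of the approximation (the binarizer is a least-squares fit of the current residual at fixed sign pattern), which is how a stack of 1-bit stages recovers accuracy that a single binary stage loses.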