
Array-Level Programming of 3-Bit per Cell Resistive Memory and Its Application for Deep Neural Network Inference

The requirements on multilevel cell (MLC) resistive random access memory (RRAM) for computing differ from those for MLC storage: computing generally requires linearly spaced conductance medians and an ultratight conductance distribution, because the column currents are summed for analog computation. In this article, a 3-bit per cell RRAM suitable for accurate inference of a deep neural network (DNN) is demonstrated, with an ultratight conductance distribution (<1.5% sigma). First, a two-loop write–verify protocol is proposed. Then, statistical experiments are conducted on an RRAM array fabricated in Winbond’s 90-nm process. By incorporating the measured conductance distribution into a DNN simulation that accounts for the real weight mapping, inference accuracy within 0.5% of the software baseline is achieved on the CIFAR-10 data set, even when 128 rows are read out in parallel. With parallel read-out enabled, the system-level energy efficiency and throughput improve by 5.3× and 4.4×, respectively, compared with using the 3-bit per cell RRAM as MLC storage.
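As a rough illustration of why computing places tighter demands on MLC RRAM than storage, the Python/NumPy sketch below maps 3-bit weight codes onto linearly spaced conductance medians, adds a ~1.5% relative programming spread (the sigma reported in the abstract), and sums the column currents contributed by 128 rows read out in parallel. The absolute conductance range, read voltages, and column count are illustrative assumptions, not values from the paper.

import numpy as np

# Illustrative parameters: 8 levels for 3 bits per cell; the conductance
# range and 1.5% relative spread are assumptions for this sketch only.
N_LEVELS = 8
G_MIN, G_MAX = 5e-6, 40e-6        # assumed conductance range in siemens
SIGMA_REL = 0.015                 # relative spread of each programmed level

g_levels = np.linspace(G_MIN, G_MAX, N_LEVELS)   # linearly spaced medians

def program_cells(weight_codes, rng):
    # Map 3-bit weight codes (0..7) to conductances with programming spread.
    g_ideal = g_levels[weight_codes]
    return g_ideal * (1.0 + SIGMA_REL * rng.standard_normal(g_ideal.shape))

def column_mac(g_array, v_inputs):
    # Analog multiply-accumulate: each column current is the sum of
    # I = G * V contributions from the rows read out in parallel.
    return v_inputs @ g_array      # shape: (columns,) of summed currents

rng = np.random.default_rng(0)
codes = rng.integers(0, N_LEVELS, size=(128, 64))  # 128 parallel rows, 64 columns (assumed)
g = program_cells(codes, rng)
v = rng.choice([0.0, 0.2], size=128)               # assumed read voltages
i_col = column_mac(g, v)
print(i_col[:4])

Comparing i_col against the variation-free result (SIGMA_REL = 0) gives a feel for why the summed column currents need an ultratight per-level distribution to preserve inference accuracy.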

Keywords: inference; deep neural network; bit per cell

Journal Title: IEEE Transactions on Electron Devices
Year Published: 2020
