Published in 2023 at "IEEE Journal of Solid-State Circuits"
DOI: 10.1109/jssc.2022.3198413
Abstract: Computing-in-memory (CIM) is an attractive approach for energy-efficient deep neural network (DNN) processing, especially for low-power edge devices. However, today’s typical DNNs usually exceed CIM-static random access memory (SRAM) capacity. The introduced off-chip communication covers…
Keywords: level sparsity; quantization; bit level; cim
Published in 2023 at "IEEE Transactions on Neural Networks and Learning Systems"
DOI: 10.1109/tnnls.2023.3250437
Abstract: With the rapid progress of deep neural network (DNN) applications on memristive platforms, there has been a growing interest in the acceleration and compression of memristive networks. As an emerging model optimization technique for memristive…
Keywords: memristive networks; sparsity tolerant; level sparsity; bit level