Published in 2023 in IEEE Computer Architecture Letters.
DOI: 10.1109/lca.2023.3275909
Abstract: Deep neural networks (DNNs) require abundant multiply-and-accumulate (MAC) operations. Thanks to DNNs' ability to accommodate noise, some of the computational burden is commonly mitigated by quantization, that is, by using lower-precision floating-point operations. Layer granularity…
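The idea the abstract describes can be sketched numerically: cast a layer's weights and activations to a lower-precision floating-point type before the multiply-accumulate, and observe that the result stays close to the full-precision reference. This is a minimal illustration of quantized MAC, not the paper's method; the function name and the choice of float16 with float32 accumulation are assumptions for the example.

```python
import numpy as np

def mac_quantized(weights, activations, dtype=np.float16):
    # Hypothetical example: cast inputs to a lower-precision float type
    # before the multiply-accumulate. Accumulating in float32 limits
    # error growth across the sum (a common practical choice).
    w = weights.astype(dtype)
    a = activations.astype(dtype)
    return float(np.sum(w.astype(np.float32) * a.astype(np.float32)))

rng = np.random.default_rng(0)
w = rng.standard_normal(1024)
a = rng.standard_normal(1024)

full = float(np.dot(w, a))       # full-precision reference
quant = mac_quantized(w, a)      # low-precision MAC
print(abs(full - quant))         # small absolute error for well-scaled inputs
```

Per-layer ("layer granularity") schemes extend this by choosing the precision or scaling independently for each layer, since layers differ in how much rounding noise they tolerate.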
Keywords: training efficiency; enhancing dnn; dynamic asymmetric; architecture …