Published in 2022 in "IEEE Transactions on Computers"
DOI: 10.1109/tc.2021.3092205
Abstract: A vast amount of activation values of DNNs are zeros due to ReLU (Rectified Linear Unit), which is one of the most common activation functions used in modern neural networks. Since ReLU outputs zero for…
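The abstract's premise is that ReLU zeroes out negative pre-activations, so any work spent fully computing a value that will end up negative is wasted. The sketch below is a minimal illustration of that general idea in Python, not the paper's actual mechanism: the hypothetical helper `pruned_neuron` accumulates a dot product term by term and stops early once the partial sum provably cannot become non-negative, since the ReLU output is then known to be zero.

```python
import numpy as np

def relu(x):
    """ReLU: outputs zero for any negative input."""
    return max(x, 0.0)

def pruned_neuron(weights, activations):
    """
    Minimal sketch of early negative detection for computation pruning
    (hypothetical helper, not the method proposed in the paper):
    stop accumulating once the result is guaranteed to be negative,
    because ReLU would map it to zero anyway.
    """
    # Upper bound on the magnitude each remaining term can contribute.
    term_bounds = np.abs(weights) * np.abs(activations)
    remaining = float(np.sum(term_bounds))

    acc = 0.0
    for w, a, bound in zip(weights, activations, term_bounds):
        remaining -= bound
        acc += w * a
        # Even if every remaining term were maximally positive, the sum
        # stays negative -> ReLU output is provably zero; prune the rest.
        if acc + remaining < 0.0:
            return 0.0
    return relu(acc)

# Usage: with zero-mean weights, many pre-activations are negative,
# so the early-exit path triggers frequently.
rng = np.random.default_rng(0)
w = rng.standard_normal(64)
x = np.maximum(rng.standard_normal(64), 0.0)  # inputs are post-ReLU, non-negative
print(pruned_neuron(w, x))
```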
Keywords: early negative; computation pruning; negative detection; relu