Articles with "relu" as a keyword




Power efficient ReLU design for neuromorphic computing using spin Hall effect

Published in 2023 in "Journal of Physics D: Applied Physics"

DOI: 10.1088/1361-6463/acdae1

Abstract: We demonstrate a magnetic tunnel junction injected with spin Hall current to exhibit linear rotation of magnetization of the free-ferromagnet using only the spin current. Using the linear resistance change of the MTJ, we devise…

Keywords: relu; physics; circuit; power
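For orientation, the transfer function being realised in hardware here is the standard ReLU, max(0, x). The device model below is only an illustrative assumption (a resistance that rises linearly with input current above a threshold), not the paper's physics:

```python
import numpy as np

def relu(x):
    # Software ReLU: max(0, x), the transfer curve the device approximates.
    return np.maximum(0.0, x)

# Hypothetical device model (illustrative assumption, not the paper's model):
# a linear resistance rise above a threshold current yields a ReLU-shaped
# input-to-output mapping.
def linear_mtj_response(i_in, slope=1.0, threshold=0.0):
    return slope * np.maximum(0.0, i_in - threshold)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```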

RMAF: Relu-Memristor-Like Activation Function for Deep Learning

Published in 2020 in "IEEE Access"

DOI: 10.1109/access.2020.2987829

Abstract: Activation functions facilitate deep neural networks by introducing non-linearity to the learning process. The non-linearity feature gives the neural network the ability to learn complex patterns. Recently, the most widely used activation function is the…

Keywords: activation; rmaf; activation function; relu
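As context for the abstract above, here is a minimal NumPy sketch of ReLU acting as the non-linearity in a small feed-forward pass. The RMAF function itself is not reproduced, since its formula does not appear in this excerpt:

```python
import numpy as np

def relu(x):
    # ReLU introduces non-linearity by zeroing negative pre-activations.
    return np.maximum(0.0, x)

# Tiny two-layer forward pass (illustrative only; random weights, no training).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                  # batch of 4 samples, 8 features
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)

hidden = relu(x @ W1 + b1)                   # non-linear hidden representation
output = hidden @ W2 + b2
print(hidden.min() >= 0.0)                   # True: ReLU never outputs negatives
```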

ComPreEND: Computation Pruning through Predictive Early Negative Detection for ReLU in a Deep Neural Network Accelerator

Published in 2022 in "IEEE Transactions on Computers"

DOI: 10.1109/tc.2021.3092205

Abstract: A vast amount of activation values of DNNs are zeros due to ReLU (Rectified Linear Unit), which is one of the most common activation functions used in modern neural networks. Since ReLU outputs zero for…

Keywords: early negative; computation pruning; negative detection; relu
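The pruning idea the abstract points to can be sketched as follows: if a neuron's partial sum can already be guaranteed to end up negative, the remaining multiply-accumulates may be skipped, because ReLU will zero the result anyway. The simple bound used below is an illustrative assumption, not ComPreEND's actual prediction scheme:

```python
import numpy as np

def relu_dot_with_early_exit(w, x, check_every=8):
    # Accumulate the dot product in chunks. If the partial sum plus an
    # optimistic bound on the remaining terms is still negative, the final
    # ReLU output must be zero, so the rest of the computation is skipped
    # (assumed bound, not the paper's predictor).
    total = 0.0
    for start in range(0, len(w), check_every):
        stop = start + check_every
        total += float(np.dot(w[start:stop], x[start:stop]))
        max_remaining = float(np.abs(w[stop:]) @ np.abs(x[stop:]))
        if total + max_remaining < 0.0:
            return 0.0  # pre-activation guaranteed negative -> ReLU yields 0
    return max(0.0, total)

rng = np.random.default_rng(1)
w, x = rng.normal(size=64), rng.normal(size=64)
print(relu_dot_with_early_exit(w, x), max(0.0, float(w @ x)))  # values agree
```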