Published in 2020 in "Neurocomputing"
DOI: 10.1016/j.neucom.2020.08.055
Abstract: Neural network learning is usually time-consuming, since backpropagation needs to compute full gradients and backpropagate them across multiple layers. Despite the success of existing works in accelerating propagation through sparseness, the relevant theoretical characteristics…
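To make the core idea concrete: sparse backpropagation typically keeps only the k largest-magnitude entries of each layer's output gradient during the backward pass, so both the weight update and the propagated gradient are computed from a sparse signal. Below is a minimal numpy sketch of this generic top-k scheme for a single linear layer. The names `SparseLinear`, `topk_sparsify`, and the hyperparameter `k` are illustrative assumptions introduced here, not the paper's implementation; in particular, the "memorized" variant the keywords refer to is not modeled.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries per row; zero out the rest.
    (Generic top-k sparsification, assumed for illustration.)"""
    idx = np.argpartition(np.abs(grad), -k, axis=-1)[..., -k:]
    mask = np.zeros_like(grad, dtype=bool)
    np.put_along_axis(mask, idx, True, axis=-1)
    return np.where(mask, grad, 0.0)

class SparseLinear:
    """Linear layer whose backward pass uses a top-k sparsified output gradient."""
    def __init__(self, in_dim, out_dim, k):
        self.W = np.random.randn(in_dim, out_dim) * 0.01
        self.k = k

    def forward(self, x):
        self.x = x                   # cache the input for the backward pass
        return x @ self.W

    def backward(self, grad_out):
        # Sparsify the upstream gradient: only k of out_dim entries per
        # example survive, so the matrix products below touch fewer values.
        g = topk_sparsify(grad_out, self.k)
        self.grad_W = self.x.T @ g   # weight gradient from the sparse signal
        return g @ self.W.T          # sparse gradient passed to earlier layers

# Hypothetical usage: keep 8 of 64 gradient entries per example.
layer = SparseLinear(in_dim=128, out_dim=64, k=8)
y = layer.forward(np.random.randn(32, 128))
dx = layer.backward(np.random.randn(32, 64))
```

The sketch drops the discarded gradient components outright; the trade-off is speed versus a biased gradient estimate, which is precisely the tension the abstract's "relevant theoretical characteristics" points at.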
Keywords:
sparse backpropagation;
backpropagation;
memorized sparse;
loss ...