Published in 2022 in the IEEE Internet of Things Journal
DOI: 10.1109/jiot.2021.3089505
Abstract: The parametrization of recurrent neural networks (RNNs) to solve the gradient vanishing and exploding problem is critical for sequential learning. The reason lies in the eigenvalues of the gradient of the loss function with respect to the recurrent weight…
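The vanishing/exploding behavior the abstract refers to can be illustrated with a minimal sketch (not the paper's method): in a linear RNN h_t = W h_{t-1}, backpropagating a gradient through T steps multiplies it by Wᵀ repeatedly, so the gradient norm is governed by the spectral radius of the recurrent weight matrix W. The names below (`grad_norm_through_time`, `W_small`, `W_large`) are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 32, 50  # hidden size and number of time steps (illustrative values)

def grad_norm_through_time(W, T):
    # Norm of (W^T)^T applied to a random unit upstream gradient,
    # i.e. the gradient propagated back through T linear recurrent steps.
    g = rng.standard_normal(W.shape[0])
    g = g / np.linalg.norm(g)
    for _ in range(T):
        g = W.T @ g
    return np.linalg.norm(g)

A = rng.standard_normal((n, n))
rho = max(abs(np.linalg.eigvals(A)))  # spectral radius of A
W_small = 0.9 * A / rho   # spectral radius 0.9: gradients tend to vanish
W_large = 1.1 * A / rho   # spectral radius 1.1: gradients tend to explode
Q, _ = np.linalg.qr(A)    # orthogonal matrix: all eigenvalue moduli equal 1

print(grad_norm_through_time(W_small, T))  # typically shrinks toward 0
print(grad_norm_through_time(W_large, T))  # typically grows large
print(grad_norm_through_time(Q, T))        # stays exactly 1 (norm-preserving)
```

The orthogonal case motivates the kind of constrained parametrization the abstract alludes to: an orthogonal recurrent weight preserves gradient norm exactly, avoiding both vanishing and explosion.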
Keywords:
weight matrix;
application;
orthogonal constraints;
recurrent weight