Published in 2022 in the "International Journal of Intelligent Systems"
DOI: 10.1002/int.22883
Abstract: Stochastic Gradient Descent (SGD) series optimization methods play a vital role in training neural networks, attracting growing attention in science and engineering fields of intelligent systems. The choice of learning rates affects the convergence…
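The abstract notes that the choice of learning rate affects convergence. As a minimal sketch (not the paper's method), the snippet below illustrates plain gradient descent with a hypothetical step learning-rate decay schedule on the toy objective f(w) = (w - 3)^2; all function names and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only: generic gradient descent with step
# learning-rate decay on f(w) = (w - 3)^2, not the paper's algorithm.

def grad(w):
    # Gradient of f(w) = (w - 3)^2.
    return 2.0 * (w - 3.0)

def sgd_with_step_decay(w0=0.0, lr0=0.4, decay=0.5, decay_every=20, steps=100):
    """Run `steps` updates, halving the learning rate every `decay_every` steps."""
    w, lr = w0, lr0
    for t in range(1, steps + 1):
        w -= lr * grad(w)
        if t % decay_every == 0:
            lr *= decay  # step decay: shrink the learning rate periodically
    return w

w_final = sgd_with_step_decay()
print(abs(w_final - 3.0) < 1e-6)  # converges close to the minimum at w = 3
```

With a well-chosen initial rate the iterates contract toward the minimizer, while decaying the rate damps oscillation late in training; too large a rate would diverge, which is the convergence sensitivity the abstract alludes to.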
Keywords:
neural networks;
optimization methods;
learning rate;
rate decay