
An automatic learning rate decay strategy for stochastic gradient descent optimization methods in neural networks

Published in 2022 in the International Journal of Intelligent Systems

DOI: 10.1002/int.22883

Abstract: Stochastic Gradient Descent (SGD)-series optimization methods play a vital role in training neural networks and are attracting growing attention in the science and engineering fields of intelligent systems. The choice of learning rate affects the convergence…
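The abstract concerns scheduling the SGD learning rate. As context only, here is a minimal sketch of a generic step-decay schedule applied to plain gradient descent; the paper's actual "automatic" decay strategy is not reproduced here, and the schedule, hyperparameters, and toy objective below are all illustrative assumptions.

```python
# Illustrative sketch only: generic step decay for SGD, NOT the paper's method.

def step_decay(lr0, drop=0.5, epochs_per_drop=10):
    """Return a schedule that multiplies the learning rate by `drop`
    every `epochs_per_drop` epochs (assumed hyperparameters)."""
    def lr(epoch):
        return lr0 * (drop ** (epoch // epochs_per_drop))
    return lr

def sgd_minimize(grad, x0, schedule, epochs=50):
    """Plain gradient descent on a 1-D objective with a decaying learning rate."""
    x = x0
    for epoch in range(epochs):
        x -= schedule(epoch) * grad(x)
    return x

# Toy example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
schedule = step_decay(lr0=0.1)
x_min = sgd_minimize(lambda x: 2 * (x - 3), x0=0.0, schedule=schedule)
```

With this schedule, early epochs take large steps toward the minimum and later epochs take progressively smaller ones, which is the basic trade-off that automatic decay strategies try to tune without manual intervention.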

Keywords: neural networks; optimization methods; learning rate; rate decay