Articles with "variable learning" as a keyword




A Batch Variable Learning Rate Gradient Descent Algorithm With the Smoothing L1/2 Regularization for Takagi-Sugeno Models

Published in 2020 in IEEE Access

DOI: 10.1109/access.2020.2997867

Abstract: A batch variable learning rate gradient descent algorithm is proposed to efficiently train a neuro-fuzzy network of zero-order Takagi-Sugeno inference systems. By using the advantages of regularization, the smoothing $L_{1/2}$ regularization is utilized to find…
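
The abstract names three ingredients: batch gradient descent, a learning rate that varies during training, and a smoothed $L_{1/2}$ penalty. The sketch below is only a rough illustration of how such pieces can fit together, not the paper's actual algorithm: the smoothing polynomial, the grow-on-improvement/shrink-on-failure rate rule, and the linear toy model standing in for the zero-order Takagi-Sugeno consequents are all assumptions made here for the example.

```python
import numpy as np

def smoothed_l_half_penalty(w, a=0.1):
    """Smoothed L_{1/2} penalty: sum_i f(w_i)^{1/2}, where f replaces |w|
    near zero by a smooth polynomial so the gradient exists everywhere.
    The polynomial used is one common smoothing choice (an assumption,
    not necessarily the paper's exact formula)."""
    absw = np.abs(w)
    f = np.where(absw >= a, absw,
                 -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8)
    return np.sum(np.sqrt(f))

def smoothed_l_half_grad(w, a=0.1):
    """Gradient of the smoothed penalty (chain rule on f(w)^{1/2})."""
    absw = np.abs(w)
    f = np.where(absw >= a, absw,
                 -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8)
    df = np.where(absw >= a, np.sign(w),
                  -w**3 / (2 * a**3) + 3 * w / (2 * a))
    return df / (2 * np.sqrt(f))

def batch_variable_lr_gd(X, y, lam=1e-3, eta=0.01, epochs=200,
                         up=1.05, down=0.7, a=0.1):
    """Batch gradient descent with a variable learning rate: the rate is
    increased after an epoch that lowers the regularized cost and decreased
    (with the step rejected) otherwise. A generic heuristic used purely to
    illustrate the 'variable learning rate' idea."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])

    def cost(w):
        err = X @ w - y
        return 0.5 * np.mean(err**2) + lam * smoothed_l_half_penalty(w, a)

    prev = cost(w)
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y) + lam * smoothed_l_half_grad(w, a)
        w_new = w - eta * grad
        new = cost(w_new)
        if new <= prev:          # accept the step, grow the learning rate
            w, prev, eta = w_new, new, eta * up
        else:                    # reject the step, shrink the learning rate
            eta *= down
    return w

# Toy usage: sparse ground-truth weights; the L_{1/2} term drives the
# irrelevant coefficients toward zero.
X = np.random.default_rng(1).normal(size=(200, 10))
w_true = np.array([2.0, -1.5, 0, 0, 0, 0, 0, 0, 0, 0])
y = X @ w_true + 0.05 * np.random.default_rng(2).normal(size=200)
print(np.round(batch_variable_lr_gd(X, y), 3))
```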

Keywords: rate; rate gradient; algorithm; learning rate …