Published in 2018 in "Circuits, Systems, and Signal Processing"
DOI: 10.1007/s00034-017-0572-z
Abstract: In this research, we propose a novel algorithm for training recurrent neural networks, called fractional back-propagation through time (FBPTT). Considering the potential of fractional calculus, we propose to use the…
Keywords: novel fractional; based learning; neural networks; gradient based …
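For context, fractional-order learning rules of this kind are usually presented as a modified weight update in which the ordinary gradient is scaled by a Caputo-style power-law factor. The sketch below uses the common single-term Caputo approximation; the function name fractional_grad_step and the choice of lower terminal c are illustrative assumptions, not the paper's exact FBPTT rule, and the BPTT gradient itself is assumed to be computed as usual.

```python
import numpy as np
from scipy.special import gamma

def fractional_grad_step(w, grad, lr=0.01, alpha=0.9, c=0.0, eps=1e-8):
    """One fractional-order gradient step (illustrative sketch only).

    Uses the single-term Caputo approximation
        D_c^alpha L(w) ~= dL/dw * |w - c|^(1 - alpha) / Gamma(2 - alpha),
    which reduces to ordinary gradient descent as alpha -> 1.
    """
    frac_factor = np.abs(w - c) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - lr * grad * (frac_factor + eps)
```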
Published in 2020 in "Filomat"
DOI: 10.2298/fil2015173l
Abstract: We propose a differential evolution algorithm based on adaptive fractional gradient descent (DE-FGD) to address the defects of existing bio-inspired algorithms, such as slow convergence and entrapment in local optima. The crossover and selection processes of…
Keywords: optimization; fractional gradient; differential evolution; gradient descent …
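A rough sketch of how a fractional-gradient refinement can be combined with standard differential evolution (DE/rand/1/bin) is given below. The names de_fgd and fractional_grad, the finite-difference gradient estimate, and the fixed F, CR, and alpha parameters are assumptions for illustration; they do not reproduce the paper's adaptive crossover and selection scheme.

```python
import numpy as np
from scipy.special import gamma

def fractional_grad(f, x, alpha=0.9, c=None, h=1e-5):
    """Single-term Caputo-style fractional gradient estimate (illustrative);
    central finite differences stand in for an analytic gradient."""
    c = np.zeros_like(x) if c is None else c
    g = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(len(x))])
    return g * np.abs(x - c) ** (1.0 - alpha) / gamma(2.0 - alpha)

def de_fgd(f, bounds, pop_size=20, F=0.5, CR=0.9, alpha=0.9, lr=0.05,
           generations=200, seed=0):
    """DE/rand/1/bin with a fractional-gradient refinement of each trial
    vector -- a sketch of the DE-FGD idea, not the paper's exact method."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])            # mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                       # keep at least one gene
            trial = np.where(cross, mutant, pop[i])               # binomial crossover
            trial -= lr * fractional_grad(f, trial, alpha=alpha)  # fractional refinement
            trial = np.clip(trial, lo, hi)
            f_trial = f(trial)
            if f_trial < fit[i]:                                  # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = np.argmin(fit)
    return pop[best], fit[best]
```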
Published in 2022 in "Axioms"
DOI: 10.3390/axioms11100507
Abstract: Motivated by the weighted averaging method for training neural networks, we study the time-fractional gradient descent (TFGD) method based on the time-fractional gradient flow and explore the influence of memory dependence on neural network training.…
Keywords: time fractional; gradient descent; fractional gradient; gradient …
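As an illustration of the memory dependence such a method introduces, the sketch below discretizes the time-fractional gradient flow D_t^alpha theta = -grad L(theta) with an explicit L1 scheme, so each update mixes the current gradient with power-law-weighted past increments. The function tfgd, the step size dt, and the explicit treatment of the gradient are assumptions for illustration, not the authors' exact TFGD algorithm.

```python
import numpy as np
from scipy.special import gamma

def tfgd(grad, theta0, dt=0.1, alpha=0.8, steps=100):
    """Explicit L1 discretization of the time-fractional gradient flow
    D_t^alpha theta = -grad L(theta); illustrative sketch only."""
    theta = np.asarray(theta0, dtype=float)
    diffs = []                                  # past increments theta_k - theta_{k-1}
    scale = gamma(2.0 - alpha) * dt ** alpha
    for _ in range(steps):
        # memory term: sum_{j>=1} b_j * (theta_{n-j} - theta_{n-j-1}),
        # with L1 weights b_j = (j+1)^(1-alpha) - j^(1-alpha)
        memory = sum(((j + 1) ** (1 - alpha) - j ** (1 - alpha)) * diffs[-j]
                     for j in range(1, len(diffs) + 1))
        step = -scale * grad(theta) - memory
        diffs.append(step)
        theta = theta + step
    return theta

# Usage on a simple quadratic, where alpha -> 1 recovers plain gradient descent:
# tfgd(lambda x: 2 * x, theta0=[1.0, -2.0], dt=0.1, alpha=0.8)
```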