Articles with "gradient descent" as a keyword
Thermal performance analysis and experimental study of high-speed motorized spindle based on the gradient descent method

Published in 2021 at "Case Studies in Thermal Engineering"

DOI: 10.1016/j.csite.2021.101056

Abstract: The high-speed motorized spindle is the core component of high-speed, high-precision machining, and its compact structure leads to internal heat accumulation and thermal deformation. Therefore, it is of great significance to control the…

Keywords: speed motorized; descent method; gradient descent; motorized spindle

Conjugate gradient descent learned ANN for Indian summer monsoon rainfall and efficiency assessment through Shannon-Fano coding

Published in 2018 at "Journal of Atmospheric and Solar-Terrestrial Physics"

DOI: 10.1016/j.jastp.2018.07.015

Abstract: This work demonstrates a neurocomputing-based predictive model for the average rainfall in India during the summer monsoon season. The backpropagation method with the Conjugate Gradient Descent algorithm has been implemented…

Keywords: shannon fano; gradient descent; fano coding; conjugate gradient
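
The conjugate gradient descent method named in this abstract can be sketched on a toy quadratic. The paper pairs it with backpropagation for a rainfall ANN, which is not reproduced here; the objective f(x, y) = (x - 1)^2 + 10(y + 2)^2 and the Fletcher-Reeves update are illustrative choices.

```python
# Nonlinear conjugate gradient descent (Fletcher-Reeves variant) on a
# simple quadratic with exact line search; illustrative, not the paper's ANN.

def grad(x):
    # gradient of f(x, y) = (x - 1)**2 + 10 * (y + 2)**2
    return [2 * (x[0] - 1), 20 * (x[1] + 2)]

def hess_quad(d):
    # d^T H d for the diagonal Hessian H = diag(2, 20)
    return 2 * d[0] ** 2 + 20 * d[1] ** 2

def conjugate_gradient(x, iters=10):
    g = grad(x)
    d = [-g[0], -g[1]]                     # start along steepest descent
    for _ in range(iters):
        if g[0] ** 2 + g[1] ** 2 < 1e-18:  # converged: gradient ~ 0
            break
        # exact line search along d, valid for a quadratic objective
        alpha = -(g[0] * d[0] + g[1] * d[1]) / hess_quad(d)
        x = [x[0] + alpha * d[0], x[1] + alpha * d[1]]
        g_new = grad(x)
        # Fletcher-Reeves update: beta = |g_new|^2 / |g|^2
        beta = (g_new[0] ** 2 + g_new[1] ** 2) / (g[0] ** 2 + g[1] ** 2)
        d = [-g_new[0] + beta * d[0], -g_new[1] + beta * d[1]]
        g = g_new
    return x

x_opt = conjugate_gradient([0.0, 0.0])  # converges to (1, -2)
```

For a quadratic in n dimensions, exact-line-search CG reaches the minimizer in at most n iterations, which is why two steps already suffice here.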

Riemannian gradient descent methods for graph-regularized matrix completion

Published in 2020 at "Linear Algebra and its Applications"

DOI: 10.1016/j.laa.2020.06.010

Abstract: Low-rank matrix completion is the problem of recovering the missing entries of a data matrix under the assumption that the true matrix admits a good low-rank approximation. Much attention has been given recently…

Keywords: matrix completion; completion; low rank; gradient descent

Application of a novel metaheuristic algorithm inspired by Adam gradient descent in distributed permutation flow shop scheduling problem and continuous engineering problems

Published in 2025 at "Scientific Reports"

DOI: 10.1038/s41598-025-01678-9

Abstract: Over the past few years, numerous swarm intelligence-based metaheuristic algorithms have been introduced and extensively applied. Although these algorithms draw on biological behaviors, their similar heuristic paradigms and modular designs lead to unbalanced exploration and…

Keywords: gradient descent; engineering; adam gradient; gradient
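
The classical Adam update that inspires this metaheuristic fits in a few lines. This is the standard single-parameter Adam rule, not the paper's swarm algorithm; the toy objective is an assumption for illustration.

```python
# Standard Adam update (Kingma & Ba) on a one-dimensional toy objective.

def adam_minimize(grad_fn, x, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=300):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)
    return x

# minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x_min = adam_minimize(lambda x: 2 * (x - 3), 0.0)
```

The per-coordinate scaling by the second-moment estimate is what bounds each step to roughly the learning rate, the property the metaheuristic reportedly borrows.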

Energy–entropy competition and the effectiveness of stochastic gradient descent in machine learning

Published in 2018 at "Molecular Physics"

DOI: 10.1080/00268976.2018.1483535

Abstract: Finding parameters that minimise a loss function is at the core of many machine learning methods. The Stochastic Gradient Descent (SGD) algorithm is widely used and delivers state-of-the-art results for many problems. Nonetheless, SGD…

Keywords: machine learning; gradient descent; energy; physics
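
The core SGD loop the abstract studies can be sketched on a toy problem, fitting the mean of a dataset. The single-sample gradient estimate and the tail averaging below are illustrative choices, not the paper's setup; the paper's energy-entropy analysis is not reproduced.

```python
# Minimal stochastic gradient descent: each step uses the gradient from a
# single randomly drawn data point instead of the full dataset.
import random

def sgd_fit_mean(data, lr=0.05, steps=2000, seed=0):
    # minimize the mean of (theta - x_i)^2; the optimum is the sample mean
    rng = random.Random(seed)
    theta = 0.0
    tail = []
    for t in range(steps):
        x = rng.choice(data)            # one-sample "minibatch"
        theta -= lr * 2 * (theta - x)   # stochastic gradient step
        if t >= steps // 2:             # average late iterates to damp noise
            tail.append(theta)
    return sum(tail) / len(tail)

theta_hat = sgd_fit_mean([1.0, 2.0, 3.0, 4.0])  # close to the mean, 2.5
```

With a constant learning rate the iterates fluctuate around the minimizer, which is why the averaged tail, rather than the last iterate, is returned; that fluctuation is exactly the entropic effect the paper analyzes.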

Exact characterisation of asymptotic running time for approximate gradient descent on random graphs

Published in 2025 at "Stochastic Analysis and Applications"

DOI: 10.1080/07362994.2024.2434234

Abstract: We study the time complexity of the search for local minima in random graphs whose vertices have i.i.d. cost values. We show that, for Erdős–Rényi graphs with connection probability given by λ/n^α (with λ…

Keywords: gradient descent; approximate gradient; time; random graphs

Growing neural networks: dynamic evolution through gradient descent

Published in 2025 at "Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences"

DOI: 10.1098/rspa.2025.0222

Abstract: In contrast to conventional artificial neural networks, which are structurally static, we present two approaches for evolving small networks into larger ones during training. The first method employs an auxiliary weight that directly controls network…

Keywords: gradient descent; growing neural; neural networks; size

Deep-learning density functionals for gradient descent optimization.

Published in 2022 at "Physical review. E"

DOI: 10.1103/physreve.106.045309

Abstract: Machine-learned regression models represent a promising tool to implement accurate and computationally affordable energy-density functionals to solve quantum many-body problems via density functional theory. However, while they can easily be trained to accurately map ground-state…

Keywords: density functionals; density; gradient descent; ground state

Aperture Shape Generation Based on Gradient Descent With Momentum

Published in 2019 at "IEEE Access"

DOI: 10.1109/access.2019.2949871

Abstract: Direct aperture optimization (DAO) is an effective method to generate high-quality intensity-modulated radiation therapy treatment plans. In generic DAO, the negative gradient direction is generally used to determine the aperture shape. However, this…

Keywords: aperture; aperture shape; gradient descent; generation
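
The gradient-descent-with-momentum update named in the title can be sketched generically. The DAO aperture-shape objective is specific to treatment planning and is replaced here by an illustrative one-dimensional quadratic.

```python
# Heavy-ball / momentum gradient descent on a toy objective: the velocity
# accumulates an exponentially weighted sum of past gradients.

def momentum_descent(grad_fn, x, lr=0.1, mu=0.9, steps=300):
    v = 0.0
    for _ in range(steps):
        v = mu * v - lr * grad_fn(x)   # velocity update with momentum mu
        x += v                         # move along the accumulated velocity
    return x

# minimize f(x) = (x - 5)^2, whose gradient is 2 * (x - 5)
x_min = momentum_descent(lambda x: 2 * (x - 5), 0.0)
```

Compared with plain gradient descent, the momentum term damps zig-zagging across narrow valleys at the cost of transient overshoot, which is why the iterates spiral into the minimizer rather than approaching it monotonically.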

Limited Gradient Descent: Learning With Noisy Labels

Published in 2019 at "IEEE Access"

DOI: 10.1109/access.2019.2954547

Abstract: Label noise may affect the generalization of classifiers, and the effective learning of main patterns from samples with noisy labels is an important challenge. Recent studies have shown that deep neural networks tend to prioritize…

Keywords: validation set; clean validation; validation; limited gradient

A Batch Variable Learning Rate Gradient Descent Algorithm With the Smoothing L1/2 Regularization for Takagi-Sugeno Models

Published in 2020 at "IEEE Access"

DOI: 10.1109/access.2020.2997867

Abstract: A batch variable learning rate gradient descent algorithm is proposed to efficiently train a neuro-fuzzy network of zero-order Takagi-Sugeno inference systems. By using the advantages of regularization, the smoothing L1/2 regularization is utilized to find…

Keywords: rate; rate gradient; algorithm; learning rate
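
One common way to make the non-differentiable L1/2 penalty amenable to gradient descent is to replace |w|^(1/2) with the smooth surrogate (w^2 + eps)^(1/4). This is an assumed smoothing choice for illustration, not necessarily the paper's exact smoothing function, and the Takagi-Sugeno network and variable learning rate schedule are omitted.

```python
# Gradient descent on a toy loss with a smoothed L1/2 penalty: the surrogate
# (w^2 + eps)^(1/4) approximates |w|^(1/2) but has a bounded gradient at 0.

def smoothed_l12(w, eps=1e-2):
    return (w * w + eps) ** 0.25

def smoothed_l12_grad(w, eps=1e-2):
    # d/dw (w^2 + eps)^(1/4) = 0.5 * w * (w^2 + eps)^(-3/4)
    return 0.5 * w * (w * w + eps) ** -0.75

def descend(w, lam=0.1, lr=0.05, steps=500):
    # minimize (w - 1)^2 + lam * smoothed_l12(w)
    for _ in range(steps):
        g = 2 * (w - 1) + lam * smoothed_l12_grad(w)
        w -= lr * g
    return w

w_star = descend(0.0)  # pulled slightly below the unregularized optimum w = 1
```

The penalty shrinks the solution toward zero, the sparsity-promoting effect that motivates L1/2 regularization, while the smoothing keeps every gradient step well defined.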