
The Diffusion Entropy Stochastic Gradient Descent Algorithm With Quasi-Optimal Combiners: Formulation and Analysis



We consider the diffusion entropy stochastic gradient descent (DE-SGD) algorithm for non-circular complex-valued signals. We theoretically analyze the mean stability and the steady-state network mean-square deviation (MSD) of DE-SGD. We develop both quasi-optimal static and adaptive combination strategies that enhance the DE-SGD algorithm by exploiting the spatial variations of the noise circularity coefficients, noise variances, regressor powers, and step sizes across the network. We validate the proposed adaptive combiners theoretically and experimentally in small-step-size scenarios, and we demonstrate the consistency of the convergence and steady-state performance of both the static combiners and the proposed adaptive combiners. Illustrative simulations and real-world data evaluations confirm the superior transient and steady-state performance of the DE-SGD algorithm with the proposed quasi-optimal combination strategies.
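To make the network structure concrete, below is a minimal sketch of a generic adapt-then-combine diffusion update over a network of nodes, the setting in which combination strategies like the ones described above operate. It is not the authors' DE-SGD: the local gradient here is a plain least-squares surrogate rather than an entropy-based cost, the combination matrix is a fixed uniform ring rather than the quasi-optimal static or adaptive combiners of the paper, and all names and parameter values are assumptions chosen only for illustration.

```python
import numpy as np

# Generic adapt-then-combine (ATC) diffusion sketch for a network of K nodes
# estimating a common complex-valued parameter vector of length M.
# NOTE: the local cost is an LMS-type surrogate, not the entropy-based cost of
# DE-SGD, and the combiner A is a fixed uniform ring, not the paper's
# quasi-optimal static/adaptive weights -- structure-only illustration.

rng = np.random.default_rng(0)

K, M = 10, 4                                   # number of nodes, filter length
w_true = rng.standard_normal(M) + 1j * rng.standard_normal(M)

# Doubly stochastic combination matrix: each node averages itself and its two
# ring neighbors with equal weight (hypothetical choice for the sketch).
A = np.zeros((K, K))
for k in range(K):
    for l in (k - 1, k, (k + 1) % K):
        A[k, l] = 1.0 / 3.0

mu = 0.01 * np.ones(K)                         # per-node step sizes (may differ)
W = np.zeros((K, M), dtype=complex)            # local estimates, one row per node

for _ in range(2000):
    # Adaptation: each node takes a stochastic-gradient step on its own data.
    psi = np.empty_like(W)
    for k in range(K):
        u = rng.standard_normal(M) + 1j * rng.standard_normal(M)     # regressor
        v = 0.05 * (rng.standard_normal() + 1j * rng.standard_normal())
        d = u @ w_true + v                                           # noisy observation
        e = d - u @ W[k]                                             # local error
        psi[k] = W[k] + mu[k] * np.conj(u) * e                       # gradient step

    # Combination: each node fuses its neighbors' intermediate estimates
    # using the combination weights A[k, l].
    W = A @ psi

msd = np.mean(np.abs(W - w_true) ** 2)
print(f"network MSD after adaptation: {msd:.2e}")
```

In this skeleton, replacing the uniform rows of A with weights tuned to each neighbor's noise level, regressor power, and step size is exactly the role the quasi-optimal static and adaptive combiners play in the paper; this sketch only fixes the surrounding adapt-then-combine mechanics under the stated assumptions.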

Keywords: quasi-optimal; gradient descent; diffusion entropy; stochastic gradient; algorithm

Journal Title: IEEE Transactions on Signal Processing
Year Published: 2022


