
On the investigation of activation functions in gradient neural network for online solving linear matrix equation


Abstract: In this paper, we investigate the influence of different activation functions (AFs) on the convergence performance of a gradient-based neural network (GNN) for solving the linear matrix equation AXB + X = C. It is observed that, by employing different AFs, i.e., the linear, power-sigmoid, sign-power, and general sign-bi-power functions, the presented GNN model achieves different convergence performance. More specifically, if the linear function is employed, the GNN model achieves exponential convergence; if the power-sigmoid function is employed, superior convergence is achieved compared with the linear case; and if the sign-power or general sign-bi-power function is employed, the GNN model achieves finite-time or fixed-time convergence, respectively. Detailed theoretical proofs are offered to demonstrate these facts. In addition, the exponential convergence rate and the upper bounds of the finite and fixed convergence times are theoretically estimated. Finally, two illustrative examples are presented to further substantiate the theoretical results and the effectiveness of the presented GNN model for solving the linear matrix equation.
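The GNN construction sketched in the abstract can be illustrated numerically. Defining the error E = AXB + X − C and the energy ||E||²_F / 2, the gradient with respect to X is AᵀEBᵀ + E, and the activated GNN evolves X along the negative gradient with the AF applied to E. The sketch below uses this standard design with a simple Euler discretization; the specific AF parameterizations (exponents, gains, step sizes) are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

# Candidate activation functions (applied elementwise). These are common
# forms from the GNN literature, assumed here for illustration; the paper's
# exact parameterizations may differ.

def linear(e):
    # linear AF: yields exponential convergence
    return e

def sign_power(e, r=0.5):
    # sign(e) * |e|^r with 0 < r < 1: a typical finite-time AF
    return np.sign(e) * np.abs(e) ** r

def sign_bi_power(e, r=0.5):
    # sign(e) * (|e|^r + |e|^(1/r)): a typical fixed-time AF
    return np.sign(e) * (np.abs(e) ** r + np.abs(e) ** (1.0 / r))

def gnn_solve(A, B, C, af=linear, gamma=10.0, dt=0.01, steps=5000):
    """Euler-discretized GNN for AXB + X = C.

    Dynamics (standard gradient design, assumed here):
        dX/dt = -gamma * (A.T @ af(E) @ B.T + af(E)),
    where E = A X B + X - C is the residual error.
    """
    X = np.zeros_like(C)
    for _ in range(steps):
        E = A @ X @ B + X - C
        X = X - gamma * dt * (A.T @ af(E) @ B.T + af(E))
    return X
```

For example, constructing C from a known X so the equation is solvable, `gnn_solve(A, B, C, af=linear)` drives the residual ||AXB + X − C||_F toward zero, and swapping in `sign_power` or `sign_bi_power` changes the convergence behavior as the abstract describes (up to discretization effects of the fixed Euler step).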

Keywords: linear matrix equation; activation functions; gradient neural network; convergence

Journal Title: Neurocomputing
Year Published: 2020
