
An Interpretation of Long Short-Term Memory Recurrent Neural Network for Approximating Roots of Polynomials


This paper presents a flexible method for interpreting the Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) with respect to the relational structure between the roots and the coefficients of a polynomial. A database is first developed from randomly selected inputs based on the degree of the univariate polynomial, and is then used to approximate the polynomial's roots through the proposed LSTM-RNN model. Furthermore, an adaptive learning optimization algorithm is used to update the network weights iteratively during training. The method thereby exploits adaptive learning rate strategies, which assign an individual learning rate to each parameter, to effectively prevent the weights from fluctuating over a wide range. Finally, several experiments show that the proposed LSTM-RNN model can be used as an alternative approach to compute an approximation of each root of a given polynomial. The results are compared with those of a conventional feedforward artificial neural network model and clearly demonstrate the superiority of the proposed LSTM-RNN model for root approximation in terms of accuracy, mean squared error, and convergence speed.
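The paper's code and exact architecture are not reproduced here; the sketch below only illustrates the pipeline the abstract describes: sample random roots, expand them into polynomial coefficients, and train an LSTM to map the coefficient sequence back to the roots. The degree, hidden size, learning rate, and the names `make_dataset` and `RootLSTM` are illustrative assumptions, and Adam stands in for the adaptive learning-rate optimizer, which the abstract does not name.

```python
# Minimal sketch (not the authors' code): train an LSTM to map the
# coefficients of a random univariate polynomial to its real roots.
import numpy as np
import torch
import torch.nn as nn

DEGREE = 4  # assumed polynomial degree

def make_dataset(n_samples, degree=DEGREE, rng=np.random.default_rng(0)):
    """Sample real roots, expand to monic coefficients, return (coeffs, roots)."""
    roots = np.sort(rng.uniform(-1.0, 1.0, size=(n_samples, degree)), axis=1)
    coeffs = np.stack([np.poly(r) for r in roots])  # length degree+1, leading coeff 1
    return (torch.tensor(coeffs, dtype=torch.float32),
            torch.tensor(roots, dtype=torch.float32))

class RootLSTM(nn.Module):
    """Treat the coefficient vector as a length-(degree+1) sequence of scalars."""
    def __init__(self, hidden=64, degree=DEGREE):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, degree)

    def forward(self, coeffs):
        seq = coeffs.unsqueeze(-1)       # (batch, degree+1, 1)
        _, (h_n, _) = self.lstm(seq)     # final hidden state summarizes the sequence
        return self.head(h_n[-1])        # predicted roots, sorted order

X, y = make_dataset(5000)
model = RootLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # adaptive per-parameter rates
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)  # MSE between predicted and true roots
    loss.backward()
    opt.step()
```

Sorting the sampled roots gives the network a fixed target ordering; without it, the same coefficient vector could map to any permutation of its roots and the MSE objective would be ill-posed.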

Keywords: long short-term memory; recurrent neural network; neural network

Journal Title: IEEE Access
Year Published: 2022
