
Neural networks catching up with finite differences in solving partial differential equations in higher dimensions


Solving partial differential equations using neural networks is mostly a proof-of-concept approach. In the case of direct function approximation, a single neural network is constructed to be the solution of a particular boundary value problem. Independent variables are fed into the input layer, and a single output is taken as the solution's value. The network is substituted into the equation, and the residual is then minimized with respect to the weights of the network using a gradient-based method. Our previous work showed that by minimizing all derivatives of the residual up to the third order one can obtain a machine-precise solution of a 2D boundary value problem using very sparse grids. The goal of this paper is to use this grid-complexity advantage to obtain a solution faster than finite differences. However, the number of all possible high-order derivatives (and therefore the training time) grows with the number of dimensions, and it was unclear whether this goal could be achieved. Here, we demonstrate that this increase can be compensated for by using random directional derivatives instead. In the 2D case neural networks are slower than finite differences, but with each additional dimension the complexity increases approximately 4 times for neural networks and 125 times for finite differences. This allows neural networks to catch up in the 3D case in memory complexity and in the 5D case in time complexity. For the first time, a machine-precise solution was obtained with a neural network faster than with the finite difference method.
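To make the direct function approximation idea concrete, the sketch below sets up a small network u(x, y), forms the residual of an illustrative 2D Poisson problem u_xx + u_yy = 0 on a sparse collocation grid, and minimizes the squared residual together with one random directional derivative of the residual per step. Everything specific here is an assumption, not the paper's setup: PyTorch, the network size, the optimizer, the grid, the zero right-hand side, and the exact penalty term are illustrative stand-ins, and boundary-condition terms are omitted for brevity.

```python
import torch

# Hypothetical trial solution: a small fully connected network (x, y) -> u(x, y).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(xy):
    """Residual of an illustrative Poisson problem u_xx + u_yy = 0 at points xy."""
    u = net(xy)
    g = torch.autograd.grad(u.sum(), xy, create_graph=True)[0]        # (u_x, u_y)
    u_xx = torch.autograd.grad(g[:, 0].sum(), xy, create_graph=True)[0][:, 0]
    u_yy = torch.autograd.grad(g[:, 1].sum(), xy, create_graph=True)[0][:, 1]
    return u_xx + u_yy

# Very sparse interior grid: the paper's point is that few collocation points suffice.
n = 4
xs = torch.linspace(0.0, 1.0, n)
grid = torch.cartesian_prod(xs, xs)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    xy = grid.clone().requires_grad_(True)
    r = pde_residual(xy)
    # Rough stand-in for the paper's random directional derivatives: differentiate
    # the residual along one random unit direction instead of forming all
    # high-order partial derivatives explicitly.
    d = torch.nn.functional.normalize(torch.randn(2), dim=0)
    dr = torch.autograd.grad(r.sum(), xy, create_graph=True)[0] @ d
    loss = (r ** 2).mean() + (dr ** 2).mean()   # boundary-condition terms omitted
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In higher dimensions only the input layer width and the sampled direction vector change, which is the informal reason the per-dimension cost of this approach grows far more slowly than that of a dense finite-difference grid.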

Keywords: partial differential; neural networks; finite differences; solution; differential equations; solving partial

Journal Title: Neural Computing and Applications
Year Published: 2020

