Sigmoid and Beyond: Algebraic Activation Functions for Artificial Neural Networks Based on Solutions of a Riccati Equation

Activation functions play a key role in neural networks, as they significantly affect the training process and the network’s performance. Based on the solution of a certain ordinary differential equation of the Riccati type, this work proposes an alternative generalized adaptive solution to the fixed sigmoid, which is called “generalized Riccati activation” (GRA). The proposed GRA function was employed on the output layer of an artificial neural network with a single hidden layer that consisted of eight neurons. The performance of the neural network was evaluated on a binary and a multiclass classification problem using different combinations of activation functions in the input/output layers. The results demonstrated that the swish/GRA combination yields higher accuracy than any other combination of activation functions. This benefit in terms of accuracy could be critical for certain domains, such as healthcare and smart grids, where AI-assisted decisions are becoming essential.
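The abstract does not reproduce the GRA formula itself, so the following note is background rather than material from the paper: the sigmoid is the canonical example of a Riccati-equation solution. A Riccati equation has the form

\[ y'(x) = q_0(x) + q_1(x)\,y(x) + q_2(x)\,y(x)^2, \]

and the logistic sigmoid \( \sigma(x) = 1/(1 + e^{-x}) \) satisfies the constant-coefficient special case

\[ \sigma'(x) = \sigma(x)\bigl(1 - \sigma(x)\bigr), \]

i.e. \( q_0 = 0,\; q_1 = 1,\; q_2 = -1 \). Solving the Riccati equation for other coefficient choices yields a parameterized family of sigmoid-like curves, which is presumably the sense in which GRA generalizes the fixed sigmoid.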
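The abstract specifies the architecture (a single hidden layer of eight neurons, swish in the hidden layer, GRA at the output) but not the GRA expression, so the sketch below is a minimal illustration, not the paper's implementation: gra_like is a hypothetical shifted/scaled logistic standing in for GRA, drawn from the solution family of a constant-coefficient Riccati equation, and its parameters a, b, k are illustrative only.

import numpy as np

def swish(x):
    # swish(x) = x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def gra_like(x, a=0.0, b=1.0, k=1.0):
    # Hypothetical stand-in for the paper's GRA: a shifted/scaled logistic,
    # the solution family of a constant-coefficient Riccati equation.
    # a, b, k are illustrative; in the paper such parameters are adaptive.
    return a + b / (1.0 + np.exp(-k * x))

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))                      # 4 samples, 2 features
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # single hidden layer of 8 neurons
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # binary-classification output

h = swish(X @ W1 + b1)     # hidden layer: swish activation
y = gra_like(h @ W2 + b2)  # output layer: GRA-style activation
print(y.ravel())           # values in (0, 1) for the default a=0, b=1

Only the forward pass is shown; training would proceed by gradient descent as usual, with the GRA parameters treated as additional trainable weights.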

Keywords: activation functions; neural networks; Riccati equation; sigmoid; artificial neural networks

Journal Title: IT Professional
Year Published: 2022
