The paper presents research on the approximation of variable-order fractional-calculus (VO-FC) operators by recurrent neural networks. The research focuses on two basic variable-order fractional operators, the integrator and the differentiator, and covers variations of the order of each operator. A recurrent neural network built from GRU (Gated Recurrent Unit) cells serves as the neural approximator for the selected fractional operators. The paper investigates the impact of the number of neurons in the hidden layer, treated as a hyperparameter, on the modeling error. The recurrent neural network was trained on synthetic data sets prepared using a modified Grünwald–Letnikov definition of variable-order fractional operators, a form suited to convenient numerical computation without memory effects. The research presented in this paper shows that a recurrent architecture based on GRU-type cells can satisfactorily approximate the targeted simple yet functional variable-order fractional operators with minor modeling errors. In addition, the presented solution is compared with basic neural networks and with recurrent neural networks that utilize Tapped Delay Lines (TDL) in their structure. The presented solution is a novel approach to the approximation of VO-FC operators and has the advantage of automatic selection of the neural approximator's parameters through optimization on data customized for specific requirements.
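To make the data-generation step concrete, the following is a minimal sketch of a Grünwald–Letnikov-type variable-order operator. It assumes the common "no memory of past orders" simplification mentioned in the abstract, i.e., at each step the binomial weights are rebuilt from the current order α(t) only; the function name, argument layout, and this particular variant are illustrative assumptions, not the authors' exact formulation.

```python
def gl_vo_derivative(f, alpha, h):
    """Approximate the variable-order Grünwald-Letnikov derivative.

    f     : list of samples f(t_k) on a uniform grid
    alpha : list of orders alpha(t_k), one per sample
    h     : grid step size

    At step k the classic GL sum is used with the *current* order
    alpha[k] (no memory effect from earlier order values):
        D^{alpha(k)} f(t_k) ~ h^{-alpha(k)} * sum_j w_j f(t_{k-j}),
    where w_j = (-1)^j * C(alpha(k), j), computed recursively as
        w_0 = 1,  w_j = w_{j-1} * (1 - (alpha(k) + 1) / j).
    """
    out = []
    for k in range(len(f)):
        a = alpha[k]
        w = 1.0          # w_0
        s = w * f[k]
        for j in range(1, k + 1):
            w *= 1.0 - (a + 1.0) / j   # recursive GL binomial weight
            s += w * f[k - j]
        out.append(s / h ** a)
    return out
```

For a constant order α ≡ 1 the weights collapse to a first-order backward difference, and α ≡ 0 yields the identity operator, which gives a quick sanity check on synthetic training data generated this way.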