This paper proposes a technique to identify nonlinear dynamical systems with time delay by extending a sparse optimization algorithm to this class of systems. The proposed algorithm combines cross-validation techniques from machine learning for automatic model selection with an algebraic operation that preprocesses the signals to filter noise and remove the dependence on initial conditions. We further integrate bootstrap resampling with the sparse regression to obtain the statistical properties of the estimates. The time delay is parameterized by a Taylor expansion. The proposed algorithm is computationally efficient and robust to noise. A nonlinear Duffing oscillator is simulated to demonstrate the efficiency and accuracy of the proposed technique, and an experimental example of a nonlinear rotary flexible joint further validates the proposed method.
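To make the sparse-regression step concrete, the sketch below illustrates the general idea on a simulated Duffing oscillator, assuming a SINDy-style sequentially thresholded least-squares (STLSQ) scheme. The parameter values, the candidate library, and the `stlsq` helper are illustrative choices, not the paper's actual algorithm; the paper's delay parameterization is only indicated by a comment showing how a first-order Taylor expansion, x(t - tau) ~ x(t) - tau*x'(t), would fold the delay into ordinary coefficients on existing library columns.

```python
# Minimal sketch of sparse identification of a Duffing oscillator,
# assuming a SINDy-style sequentially thresholded least-squares scheme.
# Parameters, library terms, and thresholds are illustrative only.
import numpy as np

# --- simulate  x'' + delta*x' + alpha*x + beta*x^3 = 0 ---
delta, alpha, beta = 0.2, -1.0, 1.0
dt, n = 0.01, 6000

def f(state):
    x, v = state
    return np.array([v, -delta * v - alpha * x - beta * x**3])

states = np.empty((n, 2))
states[0] = [2.0, 0.0]                       # start away from any fixed point
for k in range(n - 1):                       # fixed-step RK4 integration
    s = states[k]
    k1 = f(s); k2 = f(s + 0.5*dt*k1); k3 = f(s + 0.5*dt*k2); k4 = f(s + dt*k3)
    states[k + 1] = s + dt * (k1 + 2*k2 + 2*k3 + k4) / 6

x, v = states[:, 0], states[:, 1]
a = np.gradient(v, dt)                       # acceleration by finite differences

# Candidate library. A delayed state x(t - tau) would enter here via the
# first-order Taylor expansion x(t - tau) ~ x(t) - tau*v(t), so the unknown
# delay tau appears as an ordinary coefficient on the x and x' columns.
Theta = np.column_stack([x, v, x**3, x**2 * v, v**3])
names = ["x", "x'", "x^3", "x^2 x'", "x'^3"]

def stlsq(Theta, y, threshold=0.05, iters=10):
    """Sequentially thresholded least squares: refit, zero out small terms."""
    xi = np.linalg.lstsq(Theta, y, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(Theta[:, big], y, rcond=None)[0]
    return xi

xi = stlsq(Theta, a)
for name, c in zip(names, xi):
    if c != 0.0:
        print(f"x'' term: {c:+.3f} * {name}")
# Expected recovery (up to discretization error): x'' ~ x - 0.2 x' - x^3
```

The cross-validation and bootstrap steps described in the abstract would wrap this regression: model selection by scoring the sparsity threshold on held-out data, and statistical properties of the coefficients by repeating the fit on resampled rows of Theta.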