
Robust Speed Control of Ultrasonic Motors Based on Deep Reinforcement Learning of A Lyapunov Function

Speed control of ultrasonic motors (USM) needs to be precise, fast, and robust; however, this becomes a challenging task due to the nonlinear behavior of these motors, including nonlinear response, pull-out phenomenon, and speed hysteresis. Linear controllers tend to be suboptimal and unstable for such dynamics, while nonlinear controllers typically require expert knowledge, expensive online calculations, or costly model estimation. In this paper, we propose a model-free nonlinear controller, trained offline, that can significantly mitigate these challenges. A neural network speed controller was optimized using deep reinforcement learning (DRL); the soft actor-critic (SAC) algorithm was chosen for its sample efficiency, fast convergence, and stable learning. To ensure controller stability, a custom control Lyapunov reward function was proposed. The steady-state behavior of the USM was modeled mathematically to ease controller design in simulation. The SAC agent was first designed and trained in simulation and then further trained experimentally. The experimental results show that the trained controller successfully expands the speed operation range ([0, 300] rpm), plans optimal control trajectories, and stabilizes performance under varying load torque and temperature drift.
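
The abstract does not give the reward function itself, but the idea of a control Lyapunov reward can be illustrated with a short sketch. The Python snippet below is a hypothetical example, not the authors' implementation: it uses a candidate Lyapunov function V(e) = e^2 of the speed tracking error and rewards the agent whenever V decreases between consecutive control steps; the function name and the weights alpha and beta are assumptions for illustration.

    def lyapunov_reward(speed_error, prev_speed_error, alpha=1.0, beta=0.1):
        # Candidate Lyapunov function V(e) = e^2 of the speed tracking error.
        V_now = speed_error ** 2
        V_prev = prev_speed_error ** 2
        # Reward a decrease of V (error shrinking from step to step) and
        # penalize the residual error to drive the agent toward zero error.
        return alpha * (V_prev - V_now) - beta * abs(speed_error)

    # Example: target 200 rpm, measured speed improves from 150 rpm to 180 rpm.
    r = lyapunov_reward(speed_error=200 - 180, prev_speed_error=200 - 150)
    print(r)  # positive reward, since the Lyapunov candidate decreased

In a SAC setup such a reward is accumulated over each episode, so a policy that maximizes it tends to drive V downward along trajectories, which matches the stability motivation stated in the abstract.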

Keywords: speed control; ultrasonic motors; controller

Journal Title: IEEE Access
Year Published: 2022
