This literature review extends and contributes to research on the development of data-driven optimal control. Previous reviews have documented the development of model-based and data-driven control in isolation and have not critically reviewed reinforcement learning approaches for adaptive data-driven optimal control frameworks. The present review traces the development from model-based to model-free adaptive controllers, highlighting the use of data in control frameworks. In data-driven control frameworks, reinforcement learning methods may be used to derive the optimal policy for dynamical systems. Attractive characteristics of these methods, which both motivate and justify this approach, include that they require no mathematical model of complex systems, possess inherent adaptive control capabilities, learn without supervision, and can make decisions. This review considers previous reviews on these topics, including recent work on data-driven control methods. In addition, it shows how data are used to derive system dynamics, determine the control policy from feedback information, and tune fixed controllers. Furthermore, it summarises various data-driven methods and their corresponding characteristics. Finally, it provides a taxonomy, a timeline, and a concise narrative of the development from model-based to model-free data-driven adaptive control, and underlines the limitations of these techniques due to the lack of theoretical analysis. Areas of further work include theoretical analysis of stability and robustness for data-driven control systems, explainability of black-box policy-learning techniques, and an evaluation of the impact of extending system simulators to digital twins.
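As a minimal illustration of the idea at the heart of the review, the sketch below shows model-free reinforcement learning (tabular Q-learning) deriving a control policy for a toy dynamical system purely from observed data, with no mathematical model available to the learner. The system, parameters, and names here are my own illustrative assumptions, not taken from the review.

```python
import random

# Illustrative sketch (assumed setup, not from the review): model-free
# Q-learning derives a policy for a small deterministic system using only
# observed (state, action, reward, next-state) samples -- the learner never
# sees a model of the dynamics.

N_STATES = 5            # states 0..4 on a line; state 4 is the goal

def step(state, action):
    """Unknown system dynamics: the learner calls this only as a black box."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

alpha, gamma = 0.5, 0.9                      # learning rate, discount factor
Q = [[0.0, 0.0] for _ in range(N_STATES)]    # Q-table: states x {left, right}

random.seed(0)
for _ in range(2000):                        # learn from sampled trajectories
    s = 0
    for _ in range(30):
        a = random.randrange(2)              # random exploratory behaviour
        s2, r = step(s, a)
        # Data-driven update: uses only the observed sample (s, a, r, s2)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        if s2 == N_STATES - 1:
            break
        s = s2

# Greedy policy from the learned Q-values: 1 = move right toward the goal
policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(N_STATES - 1)]
print(policy)
```

The update rule never references the transition function itself, only sampled experience, which is what makes such controllers adaptive: if the dynamics drift, continued interaction re-tunes the Q-values and hence the policy.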