This article develops a distributed fault-tolerant consensus control (DFTCC) approach for multiagent systems by using adaptive dynamic programming. By establishing a local fault observer, the potential actuator faults of each agent are estimated. Subsequently, the DFTCC problem is transformed into an optimal consensus control problem by designing a novel local value function for each agent that contains the estimated fault, the consensus errors, and the control laws of the local agent and its neighbors. In order to solve the coupled Hamilton-Jacobi-Bellman equation of each agent, a critic-only structure is established to obtain the approximate local optimal consensus control law of each agent. Moreover, by using Lyapunov's direct method, it is proven that the approximate local optimal consensus control law guarantees the uniform ultimate boundedness of the consensus errors of all agents, which means that all follower agents with potential actuator faults synchronize to the leader. Finally, two simulation examples are provided to validate the effectiveness of the proposed DFTCC scheme.
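As a rough, hedged sketch (the exact functional forms are not stated in this abstract, so every symbol below is an assumption introduced for illustration), a local value function of the kind described above is commonly written in terms of the consensus error e_i of agent i, the fault-compensated control u_i + \hat{f}_i, and the neighbors' control laws u_j, j in N_i, weighted by positive (semi)definite matrices Q_i, R_{ii}, and R_{ij}:

\[
V_i\big(e_i(t)\big) = \int_t^{\infty} \Big( e_i^{\top} Q_i\, e_i
  + \big(u_i + \hat{f}_i\big)^{\top} R_{ii} \big(u_i + \hat{f}_i\big)
  + \sum_{j \in \mathcal{N}_i} u_j^{\top} R_{ij}\, u_j \Big)\, d\tau .
\]

Under the same assumptions, the coupled Hamilton-Jacobi-Bellman equation that the critic-only structure approximately solves would be the corresponding stationarity condition

\[
0 = \min_{u_i} \Big( e_i^{\top} Q_i\, e_i
  + \big(u_i + \hat{f}_i\big)^{\top} R_{ii} \big(u_i + \hat{f}_i\big)
  + \sum_{j \in \mathcal{N}_i} u_j^{\top} R_{ij}\, u_j
  + \big(\nabla V_i\big)^{\top} \dot{e}_i \Big),
\]

where the coupling across agents enters through the neighbors' control terms; the article's actual definitions may differ in detail.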