This brief is concerned with dissipativity analysis for delayed recurrent neural networks. First, a flexible negative-definiteness determination method is presented, which brings more flexibility and can further reduce the conservatism of some existing methods. Second, by employing this negative-definiteness determination method together with some integral inequalities, a tight upper bound on the derivative of the Lyapunov-Krasovskii functional is derived. Then, a less conservative delay-dependent criterion is established to guarantee that delayed recurrent neural networks are strictly dissipative. Finally, simulations are provided to confirm the superiority of the proposed result.
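For context, a commonly used notion of strict (Q, S, R)-dissipativity in this line of work can be sketched as follows; the brief's exact formulation and notation may differ:

```latex
% Hedged sketch of a standard (Q,S,R)-\gamma-dissipativity definition;
% the brief's precise formulation may differ.
% For disturbance input w(t) and output z(t), define the supply rate
%   r(w,z) = z^{T} Q z + 2 z^{T} S w + w^{T} R w,
% with Q, R symmetric. Under zero initial conditions, the network is
% strictly dissipative if there exists \gamma > 0 such that
\[
  \int_{0}^{T} r\bigl(w(t), z(t)\bigr)\,dt
  \;\ge\; \gamma \int_{0}^{T} w^{T}(t)\, w(t)\,dt
  \qquad \text{for all } T \ge 0.
\]
```

Delay-dependent criteria of the kind described in the abstract are typically expressed as linear matrix inequalities whose feasibility certifies this inequality for all admissible delays.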