We present new conditions for asymptotic stability and exponential stability of a class of stochastic recurrent neural networks with discrete and distributed time-varying delays. Our approach is based on fixed point theory and does not resort to any Liapunov function or Liapunov functional. Our results do not require boundedness, monotonicity, or differentiability of the activation functions, nor differentiability of the time-varying delays. In particular, a class of neural networks without stochastic perturbations is also considered. Examples are given to illustrate our main results.
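The abstract does not reproduce the model itself. For orientation only, a representative form of a stochastic recurrent neural network with discrete and distributed time-varying delays, as commonly studied in this literature (an assumption, not necessarily the authors' exact system), is

\[
dx_i(t) = \Big[ -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\big(x_j(t)\big) + \sum_{j=1}^{n} b_{ij} g_j\big(x_j(t-\tau_j(t))\big) + \sum_{j=1}^{n} d_{ij} \int_{t-\sigma_j(t)}^{t} h_j\big(x_j(s)\big)\, ds \Big] dt + \sum_{j=1}^{n} \rho_{ij}\big(t, x_j(t), x_j(t-\tau_j(t))\big)\, dw_j(t), \qquad i = 1, \dots, n,
\]

where \(c_i > 0\) are decay rates, \(a_{ij}, b_{ij}, d_{ij}\) are connection weights, \(f_j, g_j, h_j\) are activation functions, \(\tau_j(t)\) and \(\sigma_j(t)\) are the discrete and distributed time-varying delays, and \(w(t)\) is a Brownian motion. Setting the diffusion terms \(\rho_{ij} \equiv 0\) recovers the deterministic class without stochastic perturbations mentioned in the abstract.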
               