Abstract The analysis of deep Recurrent Neural Network (RNN) models represents a research area of increasing interest. In this context, the recent introduction of Deep Echo State Networks (DeepESNs) within the Reservoir Computing paradigm has enabled the study of the intrinsic properties of hierarchically organized RNN architectures. In this paper we investigate the DeepESN model from a dynamical systems perspective, aiming to characterize the important aspect of stability of layered recurrent dynamics excited by external input signals. To this end, we develop a framework based on the study of the local Lyapunov exponents of stacked recurrent models, enabling the analysis and control of the resulting dynamical regimes. The introduced framework is demonstrated on artificial as well as real-world datasets. The results of our analysis of DeepESNs provide interesting insights into the real effect of layering in RNNs. In particular, they show that when recurrent units are organized in layers, the resulting network intrinsically develops a richer dynamical behavior that is naturally driven closer to the edge of criticality. As confirmed by experiments on the short-term Memory Capacity task, this characterization makes the layered design effective compared to a shallow counterpart with the same number of units, especially in tasks with strong memory requirements.
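To give a concrete sense of the kind of analysis the abstract describes, the sketch below estimates the largest local Lyapunov exponent of a single leaky-tanh reservoir layer by propagating a tangent vector through the per-step Jacobians of the state map and averaging its log-growth. This is a minimal illustration of the general idea, not the authors' actual procedure; the function name, the weight names (W_in, W_hat), the leak rate, and all parameter values are illustrative assumptions.

```python
import numpy as np

def local_lyapunov_exponent(W_in, W_hat, inputs, leak=1.0, seed=0):
    """Estimate the largest local Lyapunov exponent of a leaky-tanh
    reservoir layer driven by an external input sequence, by evolving
    a tangent (perturbation) vector along the trajectory."""
    rng = np.random.default_rng(seed)
    n = W_hat.shape[0]
    x = np.zeros(n)                      # reservoir state
    v = rng.standard_normal(n)           # tangent vector
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for u in inputs:
        pre = W_in @ np.atleast_1d(u) + W_hat @ x
        x = (1.0 - leak) * x + leak * np.tanh(pre)
        # Jacobian of the state map at this time step:
        # J = (1 - a) I + a diag(1 - tanh(pre)^2) W_hat
        J = (1.0 - leak) * np.eye(n) \
            + leak * ((1.0 - np.tanh(pre) ** 2)[:, None] * W_hat)
        v = J @ v
        nrm = np.linalg.norm(v)
        log_growth += np.log(nrm)        # accumulate per-step expansion
        v /= nrm                         # renormalize to avoid over/underflow
    return log_growth / len(inputs)

# Hypothetical usage: one reservoir layer rescaled to spectral radius rho.
rng = np.random.default_rng(42)
n_units, rho = 100, 0.9
W_hat = rng.uniform(-1, 1, (n_units, n_units))
W_hat *= rho / max(abs(np.linalg.eigvals(W_hat)))
W_in = rng.uniform(-0.1, 0.1, (n_units, 1))
u_seq = rng.uniform(-0.8, 0.8, 1000)
print(local_lyapunov_exponent(W_in, W_hat, u_seq, leak=0.55))
```

A negative estimate indicates a stable (contractive) regime, while values approaching zero suggest dynamics near the edge of criticality mentioned in the abstract. For a stacked DeepESN, the global state-transition Jacobian is block lower-triangular (each layer depends only on itself and on the layers below), so one plausible extension is to run an analogous per-layer estimate with each layer driven by the state sequence of the previous one.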