
Optimizing simple deterministically constructed cycle reservoir network with a Redundant Unit Pruning Auto-Encoder algorithm


Abstract: Echo State Network (ESN) is a specific form of recurrent neural network that displays very rich dynamics owing to its reservoir-based hidden neurons. In this context, ESN is viewed as a powerful approach for modeling real-valued time series. Nevertheless, ESN has been criticized because its parameters, such as the initial input weights and reservoir layer weights, are set by manual experience or brute-force search; that is, a conventional randomly generated ESN is unlikely to be optimal because its reservoir layer weights and input layer weights are created randomly. The Simple Cycle Reservoir Network (SCRN), whose input and internal layer weights are constructed deterministically, can yield performance comparable with a conventional ESN. A Redundant Unit Pruning Auto-Encoder (RUP-AE) algorithm is proposed to optimize the input layer weights of SCRN and to resolve the problem of an ill-conditioned output weights matrix in SCRN through an unsupervised pre-training process. First, the output weights matrix of SCRN is pre-trained by the pseudo-inverse algorithm on the training data. Then, the pre-trained output weights matrix is pruned by a Redundant Unit Pruning (RUP) algorithm. Finally, the pruned output weights matrix of SCRN is injected into the input weights matrix to preserve the specificity of the auto-encoder. Three tasks, namely a nonlinear time series system identification task, a real-valued time series benchmark, and a standard chaotic time series benchmark, are used to demonstrate the advantages of RUP-AE. Extensive experimental results show that RUP-AE is effective in improving the performance of SCRN and that it resolves the problem of the ill-conditioned output weights matrix in SCRN.
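To make the three-step procedure concrete, below is a minimal Python sketch of the pipeline the abstract describes: run a simple cycle reservoir, pre-train the output weights with the pseudo-inverse on an auto-encoding (input-reconstruction) target, prune, and inject the result as new input weights. The function names (`scrn_states`, `rup_ae_input_weights`), the reconstruction target, and the pruning criterion (smallest-magnitude output weights) are illustrative assumptions; the paper's actual RUP redundancy measure is not specified in this abstract.

```python
import numpy as np

def scrn_states(u, w_in, r, washout=100):
    """Run a Simple Cycle Reservoir: the units form a ring, each fed by its
    predecessor through a single shared weight r; collect post-washout states."""
    n_res = w_in.shape[0]
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        # np.roll implements the cycle topology: unit i receives r * x[i-1]
        x = np.tanh(r * np.roll(x, 1) + w_in * u[t])
        if t >= washout:
            states.append(x.copy())
    return np.array(states)                       # (T - washout) x n_res

def rup_ae_input_weights(u, w_in0, r, keep_ratio=0.8, washout=100):
    """Hypothetical RUP-AE sketch:
    1) pre-train output weights by pseudo-inverse on an auto-encoder
       (input-reconstruction) target,
    2) prune units with the smallest |output weight| as a stand-in for
       the paper's redundancy criterion,
    3) inject the pruned vector back as the new input weights."""
    X = scrn_states(u, w_in0, r, washout)
    y = u[washout:]                               # auto-encoder target: the input itself
    w_out = np.linalg.pinv(X) @ y                 # pseudo-inverse pre-training
    k = int(keep_ratio * len(w_out))              # number of units to keep
    idx = np.argsort(np.abs(w_out))[:-k]          # indices of the most redundant units
    w_out[idx] = 0.0                              # Redundant Unit Pruning
    return w_out                                  # injected as input weights

# Toy usage on a scalar time series.
rng = np.random.default_rng(0)
u = np.sin(np.arange(2000) * 0.1) + 0.05 * rng.standard_normal(2000)
n_res, v, r = 50, 0.5, 0.9
# Deterministic SCRN input weights: shared magnitude v, alternating signs.
w_in0 = v * np.where(np.arange(n_res) % 2 == 0, 1.0, -1.0)
w_in_new = rup_ae_input_weights(u, w_in0, r)
```

Injecting the reconstruction-trained, pruned output weights as input weights is the auto-encoder step: the reservoir is shaped by the very signal it must later reconstruct, which is what ties the unsupervised pre-training to the input-weight optimization described above.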

Keywords: layer weights; algorithm; network; reservoir; weights matrix; redundant unit

Journal Title: Neurocomputing
Year Published: 2019
