
Encoding-based memory for recurrent neural networks



Abstract

Learning to solve sequential tasks with recurrent models requires the ability to memorize long sequences and to extract task-relevant features from them. In this paper, we study memorization from the point of view of the design and training of recurrent neural networks. We study how to maximize the short-term memory of recurrent units, an objective that is difficult to achieve using backpropagation alone. We propose a new model, the Linear Memory Network, which features an encoding-based memorization component built with a linear autoencoder for sequences. Additionally, we provide a specialized training algorithm that initializes the memory to efficiently encode the hidden activations of the network. Experimental results on synthetic and real-world datasets show that the chosen encoding mechanism is superior to static encodings such as orthogonal models and the delay line. The method also outperforms RNN and LSTM units trained with stochastic gradient descent. Experiments on symbolic music modeling show that the training algorithm specialized for the memorization component improves the final performance compared to stochastic gradient descent.
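As a rough illustration of the architecture described above, the sketch below implements a plausible Linear Memory Network cell: a nonlinear functional component that reads the current input and the previous memory state, and a separate purely linear memory component that encodes the history of hidden activations. The weight names, dimensions, tanh nonlinearity, and initialization scale are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

class LinearMemoryNetworkCell:
    """Sketch of a Linear Memory Network cell (assumed formulation).

    h_t = tanh(W_xh x_t + W_mh m_{t-1})   # nonlinear functional component
    m_t = W_hm h_t + W_mm m_{t-1}         # linear memory component
    """

    def __init__(self, input_size, hidden_size, memory_size, seed=None):
        rng = np.random.default_rng(seed)
        s = 0.1  # small initialization scale (arbitrary choice)
        self.W_xh = rng.normal(0.0, s, (hidden_size, input_size))
        self.W_mh = rng.normal(0.0, s, (hidden_size, memory_size))
        self.W_hm = rng.normal(0.0, s, (memory_size, hidden_size))
        self.W_mm = rng.normal(0.0, s, (memory_size, memory_size))

    def step(self, x_t, m_prev):
        # Functional component: extracts task-relevant features from the
        # current input and the previous memory state.
        h_t = np.tanh(self.W_xh @ x_t + self.W_mh @ m_prev)
        # Memory component: the update is linear, so the memory state acts
        # as a linear encoding of the sequence of hidden activations.
        m_t = self.W_hm @ h_t + self.W_mm @ m_prev
        return h_t, m_t

    def forward(self, xs):
        """Run the cell over a sequence of input vectors."""
        m_t = np.zeros(self.W_mm.shape[0])
        hs = []
        for x_t in xs:
            h_t, m_t = self.step(x_t, m_t)
            hs.append(h_t)
        return np.stack(hs), m_t

# Usage sketch: a random sequence of 20 input vectors of size 8.
cell = LinearMemoryNetworkCell(input_size=8, hidden_size=16, memory_size=32, seed=0)
hs, m_T = cell.forward(np.random.default_rng(0).normal(size=(20, 8)))
```

In this reading, the "encoding-based memorization" of the abstract would correspond to pre-training the memory weights (W_hm and W_mm above) with a linear autoencoder for sequences, so that the memory state can reconstruct the history of hidden activations before the whole network is fine-tuned end to end; the exact procedure is specific to the paper and not reproduced here.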

Keywords: encoding-based memory; recurrent neural networks

Journal Title: Neurocomputing
Year Published: 2021
