Initializing LSTM internal states via manifold learning

We present an approach, based on learning an intrinsic data manifold, for the initialization of the internal state values of long short-term memory (LSTM) recurrent neural networks, ensuring consistency with the initial observed input data. Exploiting the generalized synchronization concept, we argue that the converged, "mature" internal states constitute a function on this learned manifold. The dimension of this manifold then dictates the length of observed input time series data required for consistent initialization. We illustrate our approach through a partially observed chemical model system, where initializing the internal LSTM states in this fashion yields visibly improved performance. Finally, we show that learning this data manifold enables the transformation of partially observed dynamics into fully observed ones, facilitating alternative identification paths for nonlinear dynamical systems.
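The abstract does not include code, but the core idea (replacing the usual zero initialization of the LSTM states (h, c) with states predicted from a short observed warm-up window) can be sketched. The snippet below is an illustrative stand-in, not the authors' method: it substitutes a plain least-squares regression for the intrinsic-manifold learning described in the paper, and all names, dimensions, and the surrogate data are assumptions.

```python
# Minimal sketch (not the authors' code): learn a map from a short observed
# input window to the "mature" LSTM internal states, then use it to
# initialize (h0, c0) consistently with the initial data instead of zeros.
import torch
import torch.nn as nn

torch.manual_seed(0)

OBS_DIM, HIDDEN, WARMUP = 1, 16, 8  # assumed sizes

lstm = nn.LSTM(OBS_DIM, HIDDEN, batch_first=True)

# --- 1. Collect "mature" states: run long warm-ups so (h, c) converge to a
#        function of the recent inputs (the generalized-synchronization view).
with torch.no_grad():
    long_inputs = torch.randn(256, 100, OBS_DIM)            # surrogate data
    _, (h_mature, c_mature) = lstm(long_inputs)             # states after 100 steps
    windows = long_inputs[:, -WARMUP:, :].reshape(256, -1)  # last few observations

# --- 2. Fit the map window -> (h, c). The paper learns an intrinsic data
#        manifold here; a linear least-squares fit is a crude stand-in.
targets = torch.cat([h_mature.squeeze(0), c_mature.squeeze(0)], dim=1)
coef = torch.linalg.lstsq(windows, targets).solution  # (WARMUP*OBS_DIM, 2*HIDDEN)

# --- 3. On new data, predict consistent initial states from a short window.
new_window = torch.randn(1, WARMUP, OBS_DIM)
init = new_window.reshape(1, -1) @ coef
h0 = init[:, :HIDDEN].unsqueeze(0).contiguous()  # (num_layers, batch, hidden)
c0 = init[:, HIDDEN:].unsqueeze(0).contiguous()

with torch.no_grad():
    out, _ = lstm(new_window, (h0, c0))  # initialized consistently, not at zero
```

Per the abstract, the dimension of the learned manifold dictates how long the observed warm-up window (here the assumed WARMUP steps) must be for the initialization to be well defined.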

Keywords: LSTM; internal states; initialization; manifold learning

Journal Title: Chaos
Year Published: 2021
