
Learning Continuous-Time Dynamics With Attention



Learning the hidden dynamics underlying sequence data is crucial. An attention mechanism can be introduced to spotlight the regions of interest for sequential learning. Traditionally, attention was measured between a query and a sequence based on a discrete-time state trajectory, so it could not characterize irregularly-sampled sequence data. This paper presents an attentive differential network (ADN) in which attention is developed over continuous-time dynamics. Continuous-time attention is performed over the dynamics at all times, so the information missing from irregular or sparse samples can be seamlessly compensated and attended. Self-attention is computed to find the attended state trajectory. However, the memory cost of the attention scores between a query and a sequence is demanding, since self-attention treats all time instants as query points in an ordinary differential equation solver. This issue is tackled by imposing a causality constraint in the causal ADN (CADN), where the query is merged up to the current time. To enhance model robustness, this study further explores a latent CADN in which the attended dynamics are calculated in an encoder-decoder structure via Bayesian learning. Experiments on irregularly-sampled actions, dialogues and bio-signals illustrate the merits of the proposed methods in action recognition, emotion recognition and mortality prediction, respectively.
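The core idea — integrate hidden dynamics with an ODE solver so states exist at arbitrary times, then attend over the resulting trajectory — can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions, not the authors' actual ADN: the `ode_states` helper, the fixed Euler step size, and the plain dot-product scoring are all stand-ins for the learned components in the paper.

```python
import numpy as np

def ode_states(z0, t_grid, f, dt=0.01):
    """Integrate dz/dt = f(z, t) with Euler steps, recording the state
    at each (possibly irregularly spaced) time in t_grid."""
    states, z, t = [], np.asarray(z0, dtype=float), t_grid[0]
    for t_next in t_grid:
        while t < t_next:          # step the dynamics up to the next query time
            z = z + dt * f(z, t)
            t += dt
        states.append(z.copy())
    return np.stack(states)        # shape: (len(t_grid), dim)

def continuous_time_attention(query, states):
    """Scaled dot-product attention of one query over the state trajectory;
    softmax weights over all time points."""
    scores = states @ query / np.sqrt(len(query))
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ states, weights

# Toy usage: decaying dynamics observed at irregular times.
t_grid = np.array([0.0, 0.5, 1.0, 2.0])
states = ode_states([1.0, 0.0], t_grid, lambda z, t: -z)
attended, weights = continuous_time_attention(np.array([1.0, 0.0]), states)
```

A causal variant in the spirit of CADN would restrict the softmax at time t to states at times ≤ t, which is what caps the memory cost of treating every time instant as a query point.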

Keywords: time; continuous time; time dynamics; learning continuous; sequence; attention

Journal Title: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year Published: 2022


