
Dissimilarity-Preserving Representation Learning for One-Class Time Series Classification.



We propose to embed time series in a latent space where pairwise Euclidean distances (EDs) between samples are equal to pairwise dissimilarities in the original space, for a given dissimilarity measure. To this end, we use auto-encoder (AE) and encoder-only neural networks to learn elastic dissimilarity measures, e.g., dynamic time warping (DTW), that are central to time series classification (Bagnall et al., 2017). The learned representations are used in the context of one-class classification (Mauceri et al., 2020) on the datasets of the UCR/UEA archive (Dau et al., 2019). Using a 1-nearest neighbor (1NN) classifier, we show that the learned representations achieve classification performance close to that of raw data, but in a space of substantially lower dimensionality. This implies substantial savings in computational and storage requirements for nearest neighbor time series classification.
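The core idea can be illustrated with a minimal sketch: compute pairwise DTW dissimilarities on a handful of series, map the series into a low-dimensional latent space with an encoder, and measure how well latent Euclidean distances match the DTW targets. The quadratic-time DTW recurrence, the random linear "encoder", and the `pairwise_loss` objective below are illustrative assumptions, not the authors' implementation (which trains neural networks to minimize this kind of mismatch).

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def pairwise_loss(Z, D_target):
    """Mean squared mismatch between latent EDs and target dissimilarities."""
    n, loss, count = len(Z), 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            ed = np.linalg.norm(Z[i] - Z[j])
            loss += (ed - D_target[i, j]) ** 2
            count += 1
    return loss / count

# Toy data: three short series of length 20.
rng = np.random.default_rng(0)
X = [rng.standard_normal(20) for _ in range(3)]

# Precompute the pairwise DTW dissimilarity matrix.
D = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        D[i, j] = dtw(X[i], X[j])

# Hypothetical stand-in encoder: a fixed random projection to 4 dims.
# A trained encoder would adjust its weights to drive this loss down.
W = rng.standard_normal((20, 4))
Z = [x @ W for x in X]
print(pairwise_loss(Z, D))
```

A trained AE or encoder-only network replaces the random projection `W`, and gradient descent on `pairwise_loss` yields a latent space where plain Euclidean 1NN approximates DTW-based 1NN at a fraction of the dimensionality.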

Keywords: time series; classification; dissimilarity; time series classification

Journal Title: IEEE transactions on neural networks and learning systems
Year Published: 2023


