
Event recognition method based on dual-augmentation for a Φ-OTDR system with a few training samples.

Thanks to advances in machine learning and deep learning, data-driven pattern recognition based on neural networks has become a trend for intrusion event recognition in Φ-OTDR systems. Data-driven pattern recognition requires a large number of training samples, but in some scenarios intrusion signals are difficult to collect, leaving a shortage of training samples. Labeling a large number of samples is also very time-consuming. This paper presents a few-shot learning classification method for Φ-OTDR systems based on two kinds of data augmentation: time series transfer and the cycle-consistent generative adversarial network (CycleGAN). By expanding the rare samples with time series transfer and CycleGAN, the dataset can be grown until it meets the requirements of network training. Experimental results show that even when the training set contains two minority classes with only two samples each, the proposed method still reaches an average accuracy of 90.84% on the validation set with five classification tasks, and 79.28% classification accuracy on the minority classes.
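
The abstract names the two augmentations but not their implementation details. Below is a minimal sketch, in PyTorch, of the general ideas as they might apply to 1-D Φ-OTDR vibration traces of fixed length: a time-shift augmentation standing in for "time series transfer", and the cycle-consistency loss that is the core constraint of CycleGAN. All names here (shift_augment, Generator1D, cycle_loss) and the tiny generator architecture are illustrative assumptions, not the authors' code.

```python
# Hedged sketch: two augmentation ideas from the abstract, applied to 1-D traces.
# Assumption: each sample is a fixed-length vibration trace (e.g. 4000 points).
import torch
import torch.nn as nn


def shift_augment(signal: torch.Tensor, max_shift: int, n_copies: int) -> torch.Tensor:
    """Time-series transfer (assumed here to mean circular time shifts):
    create n_copies new samples by rolling one (length,) trace in time."""
    shifts = torch.randint(1, max_shift + 1, (n_copies,))
    return torch.stack([torch.roll(signal, int(s)) for s in shifts])


class Generator1D(nn.Module):
    """Toy 1-D convolutional generator mapping one signal domain to another.
    A real CycleGAN uses two such generators (A->B and B->A) plus discriminators."""
    def __init__(self, channels: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(channels, 1, kernel_size=7, padding=3),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, 1, length)
        return self.net(x)


def cycle_loss(g_ab: nn.Module, g_ba: nn.Module,
               real_a: torch.Tensor, real_b: torch.Tensor) -> torch.Tensor:
    """Cycle-consistency term: translating A->B->A (and B->A->B)
    should reconstruct the original trace."""
    l1 = nn.L1Loss()
    return l1(g_ba(g_ab(real_a)), real_a) + l1(g_ab(g_ba(real_b)), real_b)


if __name__ == "__main__":
    # Usage sketch: expand a two-sample minority class before classifier training.
    rare_trace = torch.randn(4000)  # placeholder for one rare intrusion trace
    augmented = shift_augment(rare_trace, max_shift=500, n_copies=8)
    print(augmented.shape)  # torch.Size([8, 4000])
```

The cycle-consistency term is what lets a CycleGAN learn from unpaired data: traces from an abundant class can be translated toward the style of a rare class without needing matched example pairs, which is exactly the situation when minority-class intrusion signals are scarce.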

Keywords: recognition; training; OTDR system; method

Journal Title: Optics Express
Year Published: 2022
