Relation-Aware Attentive Neural Processes Model for Remaining Useful Life Prediction

For the remaining useful life (RUL) prediction task, the temporal information underlying the acquired data is crucial for prediction accuracy. Typical deep learning models for RUL prediction are built on a recurrent neural network (RNN) or a convolutional neural network (CNN) with a time window. However, the CNN model cannot explicitly extract the global temporal information of the sequence, and the RNN model suffers from slow forward computation due to its sequential structure. This article proposes the relation-aware attentive neural processes (R-ANPs) model to solve the RUL prediction problem. A local relation-aware self-attention module first processes the input time series, explicitly fusing the local temporal information between adjacent sequence points. The RUL is then obtained by feeding the processed data into the attentive neural processes (ANPs) model. Both the relation-aware self-attention module and the ANPs model support parallel computation during training and inference, which accelerates the training process. The proposed R-ANPs model has three innovative advantages: 1) the relation-aware self-attention module explicitly fuses local temporal information into each data point; 2) the model output includes a standard deviation, which provides an uncertainty measure for the prediction results; and 3) the training data serve as context for the ANPs model, so their information is exploited at the prediction stage. The effectiveness of the proposed model is validated on a run-to-failure dataset. The results demonstrate that the proposed model outperforms recent RNN- and CNN-based models.
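The abstract does not give implementation details, but as a rough illustration of the two components it describes, below is a minimal sketch (assuming PyTorch) of a local relation-aware self-attention layer that adds clipped relative-position ("relation") embeddings to the attention logits, followed by a head that outputs a mean and standard deviation for the RUL, matching the uncertainty output mentioned above. All class names (LocalRelationAwareAttention, RULHead), dimensions, and hyperparameters are illustrative assumptions rather than the authors' code, and the ANPs context/target mechanism is omitted for brevity.

# Minimal sketch (not the authors' implementation): local relation-aware
# self-attention plus a Gaussian RUL head. All names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalRelationAwareAttention(nn.Module):
    """Self-attention over a time window with learned relative-position
    (relation) embeddings clipped to a small local neighbourhood."""

    def __init__(self, d_model: int, max_rel_dist: int = 3):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One embedding per clipped relative offset in [-max_rel_dist, max_rel_dist].
        self.rel_k = nn.Embedding(2 * max_rel_dist + 1, d_model)
        self.max_rel_dist = max_rel_dist
        self.scale = d_model ** -0.5

    def forward(self, x):                        # x: (batch, time, d_model)
        T = x.size(1)
        q, k, v = self.q(x), self.k(x), self.v(x)
        # Clipped relative distances between every pair of time steps.
        idx = torch.arange(T, device=x.device)
        rel = (idx[None, :] - idx[:, None]).clamp(
            -self.max_rel_dist, self.max_rel_dist) + self.max_rel_dist
        rel_emb = self.rel_k(rel)                # (T, T, d_model)
        # Content term plus relation-aware term in the attention logits.
        logits = torch.einsum("btd,bsd->bts", q, k)
        logits = (logits + torch.einsum("btd,tsd->bts", q, rel_emb)) * self.scale
        attn = logits.softmax(dim=-1)
        return torch.einsum("bts,bsd->btd", attn, v)


class RULHead(nn.Module):
    """Maps the attended sequence to a Gaussian RUL prediction (mean, std),
    providing the uncertainty estimate mentioned in the abstract."""

    def __init__(self, d_model: int):
        super().__init__()
        self.out = nn.Linear(d_model, 2)

    def forward(self, h):                        # h: (batch, time, d_model)
        mu, log_sigma = self.out(h.mean(dim=1)).chunk(2, dim=-1)
        return mu, F.softplus(log_sigma) + 1e-3  # keep the std positive


if __name__ == "__main__":
    x = torch.randn(8, 30, 32)                   # 8 windows, 30 steps, 32 features
    h = LocalRelationAwareAttention(d_model=32)(x)
    mu, sigma = RULHead(d_model=32)(h)
    print(mu.shape, sigma.shape)                 # torch.Size([8, 1]) torch.Size([8, 1])

In such a sketch, both the attention layer and the head process all time steps of a window in one pass, which reflects the parallel-computation advantage the abstract contrasts with the sequential RNN.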

Keywords: relation aware; prediction; model; neural processes; attentive neural

Journal Title: IEEE Transactions on Instrumentation and Measurement
Year Published: 2022
