Abstract Remaining useful life (RUL) is the estimated continuous normal working time of a component or system from the current moment until potential failure. Traditional estimation methods incur high trial-and-error costs and transfer poorly to new tasks. Neural architecture search (NAS) partially solves the problem of constructing network models automatically. However, most NAS search strategies rely on reinforcement learning or evolutionary algorithms, which search a discrete space and treat the objective function as a black box, making them very time-consuming. To address this problem, we propose a gradient-based neural architecture search method. The method regards a cell in the search space as a directed acyclic graph (DAG) containing N ordered nodes, where each node is a latent representation and each directed edge represents an operation that transforms one node into another. By mixing the candidate operations (e.g., ReLU, tanh) with the softmax function, the search space becomes continuous and the objective function becomes differentiable, so gradient-based optimization can be used to find the optimal structure. Extensive experiments show that the resulting gradient-based NAS method for RUL estimation clearly outperforms traditional approaches as well as Long Short-Term Memory (LSTM) networks, while consuming far fewer computing resources than reinforcement-learning-based NAS.
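The continuous relaxation described above can be made concrete with a short sketch. The following is a minimal PyTorch illustration, not the authors' implementation: the class name MixedOp, the parameter name alpha, and the zero initialization are assumptions for illustration; only the idea of a softmax mixture over the candidate operations (ReLU, tanh) on each DAG edge comes from the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations on one edge of the cell DAG. The abstract names
# ReLU and tanh; in general any set of primitive operations could be used.
CANDIDATES = [nn.ReLU(), nn.Tanh()]

class MixedOp(nn.Module):
    """Softmax-weighted mixture of candidate operations on a single edge.

    Because the architecture weights `alpha` are continuous, the training
    loss is differentiable with respect to them, and the architecture can
    be searched by ordinary gradient descent instead of treating the
    objective as a black box.
    """
    def __init__(self, num_ops: int = len(CANDIDATES)):
        super().__init__()
        # One learnable weight per candidate op (zero init is an assumption;
        # softmax then starts from a uniform mixture).
        self.alpha = nn.Parameter(torch.zeros(num_ops))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)  # continuous relaxation
        return sum(w * op(x) for w, op in zip(weights, CANDIDATES))

# After the search converges, a discrete architecture is typically recovered
# by keeping the operation with the largest weight on each edge, e.g.:
#   best_op = CANDIDATES[mixed.alpha.argmax()]
```

In this relaxed form, the mixture weights can be updated with the same backpropagation pass that trains the network weights, which is where the efficiency gain over reinforcement-learning or evolutionary search comes from.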