
Reinforcement Learning-based Trajectory Optimization for Data Muling with Underwater Mobile Nodes


This manuscript addresses trajectory optimization for underwater data muling with mobile nodes. In this scenario, multiple autonomous underwater vehicles (AUVs) explore or sample a mission area, and autonomous surface vehicles (ASVs) visit the underway AUVs to retrieve the collected data. The optimization objectives are to simultaneously maximize fairness in data transmissions and minimize the travel distance of the surface nodes. We propose a nearest-K reinforcement learning algorithm in which only the K AUVs nearest to the ASV are considered as candidates for the next data transmission. The state is the set of distances between the AUVs and the ASV, the action is the selected AUV, and the reward is a function of both the data volume transmitted and the ASV travel distance. For scenarios with multiple ASVs, an AUV association strategy is proposed to support the use of multiple surface nodes. We evaluate performance through computer simulations, investigating the effects of the number of AUVs, the size of the mission area, and the choice of state. Simulation results show that the proposed algorithm outperforms traditional methods in terms of both fairness and ASV travel distance.
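The candidate restriction, action selection, and reward described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the weights `alpha` and `beta`, the epsilon-greedy policy, and the tabular Q-value lookup are assumptions made for the example.

```python
import math
import random

def nearest_k_candidates(asv_pos, auv_positions, k):
    """Return indices of the K AUVs closest to the ASV (the action candidates)."""
    dists = sorted((math.dist(asv_pos, p), i) for i, p in enumerate(auv_positions))
    return [i for _, i in dists[:k]]

def reward(data_volume, travel_dist, alpha=1.0, beta=0.1):
    """Hypothetical reward: grows with data transmitted, shrinks with ASV travel.
    alpha and beta are illustrative trade-off weights, not values from the paper."""
    return alpha * data_volume - beta * travel_dist

def choose_action(q_table, state, candidates, epsilon=0.1):
    """Epsilon-greedy selection restricted to the nearest-K candidate AUVs."""
    if random.random() < epsilon:
        return random.choice(candidates)
    return max(candidates, key=lambda a: q_table.get((state, a), 0.0))
```

Restricting the action set to the nearest K AUVs keeps the action space small as the fleet grows, which is the point of the nearest-K formulation.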

Keywords: reinforcement learning; underwater; data muling; mobile nodes; trajectory optimization; optimization

Journal Title: IEEE Access
Year Published: 2022


