
Intelligent Navigation of Indoor Robot Based on Improved DDPG Algorithm



Targeting the problem of autonomous navigation for indoor robots in large-scale, complicated, and unknown environments, this paper proposes an autonomous online decision-making algorithm based on deep reinforcement learning. Traditional path planning methods rely on environment modeling, which increases the computational workload. In this paper, readings from the sensors that detect surrounding obstacles are fed into the DDPG (deep deterministic policy gradient) algorithm, which maps environmental perception directly to control actions, enabling robots to complete autonomous navigation and delivery tasks without relying on environment modeling. In addition, the algorithm preprocesses the relevant data in the learning samples with Gaussian noise, helping the agent adapt to noisy training environments and improving its robustness. The simulation results show that the optimized DL-DDPG algorithm is more efficient at online decision-making for the indoor robot navigation system, enabling the robot to complete autonomous navigation and intelligent control independently.
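
The approach described above, a DDPG actor mapping raw sensor observations straight to continuous control actions, with Gaussian noise applied to sampled training data, can be illustrated with a minimal sketch. The observation and action dimensions, network sizes, and noise scale below are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: a laser scan reduced to 24 range readings plus a
# relative goal position; action = (linear velocity, angular velocity).
OBS_DIM, ACT_DIM = 26, 2

class Actor(nn.Module):
    """Maps sensor observations directly to continuous control actions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, ACT_DIM), nn.Tanh(),  # actions bounded in [-1, 1]
        )

    def forward(self, obs):
        return self.net(obs)

def add_gaussian_noise(batch_obs, sigma=0.01):
    """Preprocess sampled observations with Gaussian noise so the agent
    learns to tolerate noisy sensor readings (sigma is an assumed value)."""
    return batch_obs + sigma * torch.randn_like(batch_obs)

# Usage sketch: perturb a replay-buffer batch before the DDPG actor/critic update.
actor = Actor()
batch_obs = torch.rand(64, OBS_DIM)        # stand-in for sampled transitions
noisy_obs = add_gaussian_noise(batch_obs)  # robustness-oriented preprocessing
actions = actor(noisy_obs)                 # direct action output, no environment model
```

In a full DDPG training loop, the same noisy observations would also feed the critic update; only the noise-injection step specific to the paper's preprocessing idea is shown here.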

Keywords: indoor robot; ddpg algorithm; navigation; navigation indoor

Journal Title: Mathematical Problems in Engineering
Year Published: 2023
