The development of simultaneous localization and mapping (SLAM) technology plays an important role in robot navigation and autonomous vehicle innovation. ORB-SLAM2 is a unified SLAM solution for monocular, stereo, and RGB-D cameras that constructs a sparse feature-point map for real-time positioning. However, a sparse-map-based approach cannot effectively meet the requirements of robot navigation, environment reconstruction, and other tasks. In this paper, a dense mapping thread is added to the existing ORB-SLAM2 system. The depth map and color image obtained by stereo matching of a binocular camera are used to generate a three-dimensional point cloud for each keyframe; the point clouds are then fused using the tracked and optimized keyframe poses to obtain a real-time point cloud map. Experiments conducted on the KITTI dataset and in a real environment under ROS show that the proposed system constructs a clear three-dimensional point cloud map while estimating an accurate trajectory.
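The core of the dense-mapping step described above is back-projecting each keyframe's depth map into a colored 3D point cloud and transforming it into the world frame with the keyframe's optimized pose. The following is a minimal NumPy sketch of that idea; the function name, parameter names, and pinhole-camera assumptions are illustrative and not taken from the ORB-SLAM2 codebase.

```python
import numpy as np

def depth_to_point_cloud(depth, color, fx, fy, cx, cy, pose):
    """Back-project a depth map into a colored 3D point cloud and
    transform it into the world frame with the keyframe pose.

    depth : (H, W) depth in meters (0 = invalid)
    color : (H, W, 3) RGB image aligned with the depth map
    fx, fy, cx, cy : pinhole camera intrinsics
    pose  : (4, 4) camera-to-world transform of the keyframe
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0                      # skip pixels with no stereo match
    z = depth[valid]
    x = (u[valid] - cx) * z / fx           # pinhole back-projection
    y = (v[valid] - cy) * z / fy
    pts_cam = np.stack([x, y, z], axis=1)  # N x 3, camera frame
    pts_h = np.hstack([pts_cam, np.ones((len(z), 1))])
    pts_world = (pose @ pts_h.T).T[:, :3]  # move into world frame
    colors = color[valid]                  # N x 3 RGB per point
    return pts_world, colors
```

Fusing the per-keyframe clouds then amounts to concatenating (and typically voxel-downsampling) the `pts_world` arrays from all keyframes, since each is already expressed in the common world frame.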