
A Novel Lidar-Assisted Monocular Visual SLAM Framework for Mobile Robots in Outdoor Environments


In this article, a novel 3-D lidar-assisted monocular visual simultaneous localization and mapping (LAMV-SLAM) framework is proposed for mobile robots in outdoor environments. LAMV-SLAM can run in real time without a GPU and build a dense map with real scale. An online photometric calibration thread is integrated into LAMV-SLAM to eliminate photometric disturbances in images. The tracking thread combines lidar and vision data to estimate and refine the frame-to-frame transformation. In this thread, a depth fusion algorithm is proposed to provide accurate depth values for the extracted visual features by combining the lidar points, and a novel two-stage optimization method is proposed to use the fused lidar–vision data to estimate the camera transformation with real scale. A parallel mapping thread generates new map points based on a depth filter and lidar–vision data fusion. A loop closing thread further reduces the accumulated errors of the system. To verify the accuracy and efficiency of the system, we evaluated the proposed pipeline on the KITTI odometry benchmark, where LAMV-SLAM achieves a relative position drift of 0.81% while running at over 3× real-time speed. To verify the robustness of the system in challenging environments, experiments were carried out on the North Campus Long-Term (NCLT) and nuScenes datasets. Moreover, real-world experiments were conducted on our mobile robot platform to demonstrate the practicability and validity of the proposed approach.
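The depth-fusion step mentioned in the abstract pairs extracted image features with metric depths taken from the lidar point cloud. The abstract does not give the algorithm's details, so the sketch below only illustrates the generic projection-and-association pattern such a step builds on; the function names, the camera intrinsics K, the lidar-to-camera extrinsics T_cam_lidar, and the nearest-projected-point association within a pixel radius are illustrative assumptions, not the authors' implementation.

# Minimal sketch of lidar-to-camera depth association, assuming a calibrated
# rig (intrinsics K, extrinsics T_cam_lidar). Not the paper's algorithm.
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project Nx3 lidar points into the image; return pixel coords and depths."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])  # Nx4 homogeneous
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]                          # points in camera frame
    in_front = pts_cam[:, 2] > 0.1                                      # keep points ahead of the camera
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                                         # perspective division
    return uv, pts_cam[:, 2]                                            # pixel coords, metric depths

def fuse_feature_depths(features_uv, lidar_uv, lidar_depth, radius_px=4.0):
    """Assign each 2-D visual feature the depth of the nearest projected lidar
    point within radius_px; features with no nearby point keep NaN."""
    depths = np.full(len(features_uv), np.nan)
    for i, f in enumerate(features_uv):
        d2 = np.sum((lidar_uv - f) ** 2, axis=1)
        j = np.argmin(d2)
        if d2[j] <= radius_px ** 2:
            depths[i] = lidar_depth[j]
    return depths

In a system like the one described, features left without a lidar depth would be handled elsewhere, e.g. by the depth filter in the mapping thread mentioned in the abstract.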

Keywords: lidar-assisted monocular visual SLAM; lidar; SLAM

Journal Title: IEEE Transactions on Instrumentation and Measurement
Year Published: 2022
