Traditional Visual Odometry (VO) methods that rely on visible-light cameras frequently degrade in challenging illumination environments. Alternative vision sensors such as thermal cameras are promising for all-day navigation, since the thermal images they deliver are invariant to ambient illumination. However, traditional VO techniques cannot be directly transferred to the thermal domain because of the poor quality of thermal imagery. Moreover, thermal cameras periodically suspend image capture owing to a unique imaging mechanism, Non-Uniformity Correction (NUC), which makes thermal VO prone to losing tracking. In this letter, we propose a thermal-depth odometry method that fuses information from both sensor types, thermal and depth cameras. The system front-end estimates 6-DoF camera motion via a semi-direct framework, fully exploiting thermographic data cues from raw thermal images. Depth information is aligned with the thermal images through the extrinsic parameters to enhance the robustness of motion estimation. To overcome the challenge posed by the NUC, the proposed method introduces an NUC handling module that performs pose estimation by registering multiple point clouds generated from depth images. The proposed method is evaluated on public datasets, and the results demonstrate that it provides competitive localization performance under varying illumination conditions.
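To illustrate the depth-to-thermal alignment step, the following is a minimal numpy sketch rather than the authors' code: it back-projects each valid depth pixel to 3D with assumed depth intrinsics `K_depth`, maps the points into the thermal camera frame with an assumed extrinsic transform `T_thermal_depth`, and projects them with assumed thermal intrinsics `K_thermal`. All matrix names and the function itself are hypothetical placeholders for whatever calibration the paper actually uses.

```python
import numpy as np

def align_depth_to_thermal(depth, K_depth, K_thermal, T_thermal_depth):
    """Reproject a depth map into the thermal camera's image plane.

    depth           -- (H, W) depth map in meters (0 marks invalid pixels)
    K_depth         -- 3x3 depth-camera intrinsics (assumed known)
    K_thermal       -- 3x3 thermal-camera intrinsics (assumed known)
    T_thermal_depth -- 4x4 extrinsic transform, depth frame -> thermal frame
    Returns (u, v, z): thermal-image pixel coordinates and depths for
    every valid depth pixel that lands in front of the thermal camera.
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]

    # Back-project valid depth pixels to 3D points in the depth frame.
    x = (us[valid] - K_depth[0, 2]) * z / K_depth[0, 0]
    y = (vs[valid] - K_depth[1, 2]) * z / K_depth[1, 1]
    pts = np.stack([x, y, z, np.ones_like(z)], axis=0)   # 4 x N homogeneous

    # Transform the points into the thermal camera frame via the extrinsics.
    pts_t = (T_thermal_depth @ pts)[:3]

    # Keep points in front of the camera and project with thermal intrinsics.
    in_front = pts_t[2] > 0
    pts_t = pts_t[:, in_front]
    u = K_thermal[0, 0] * pts_t[0] / pts_t[2] + K_thermal[0, 2]
    v = K_thermal[1, 1] * pts_t[1] / pts_t[2] + K_thermal[1, 2]
    return u, v, pts_t[2]
```

In practice the returned coordinates would also be clipped to the thermal image bounds and resolved for occlusions (nearest depth wins) before being used to attach depth to thermal pixels.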
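The abstract describes the NUC handling module only at a high level. One common way to realize "pose estimation by registering multiple point clouds" is point-to-point ICP; the sketch below, using scipy's KD-tree for correspondences and a closed-form SVD (Kabsch) update, is a plausible instantiation under that assumption, not the paper's actual algorithm.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_point_to_point(src, dst, iters=30, tol=1e-6):
    """Estimate the rigid transform aligning src (N x 3) to dst (M x 3).

    Minimal point-to-point ICP: nearest-neighbour correspondences via a
    KD-tree, then a closed-form SVD (Kabsch) solve per iteration.
    Returns a 4x4 transform T such that T applied to src approximates dst.
    """
    T = np.eye(4)
    cur = src.copy()
    tree = cKDTree(dst)
    prev_err = np.inf
    for _ in range(iters):
        # 1. Find the closest destination point for every source point.
        dists, idx = tree.query(cur)
        matched = dst[idx]

        # 2. Closed-form rigid alignment of the matched sets (Kabsch).
        mu_s, mu_d = cur.mean(axis=0), matched.mean(axis=0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s

        # 3. Accumulate the incremental transform and update the points.
        dT = np.eye(4)
        dT[:3, :3], dT[:3, 3] = R, t
        T = dT @ T
        cur = cur @ R.T + t

        # 4. Stop when the mean correspondence distance stops improving.
        err = dists.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return T
```

During a NUC pause, successive depth frames could be converted to point clouds (e.g., with a back-projection like the previous sketch) and chained through such a registration to bridge the gap in thermal tracking.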