Multisource fusion localization is a mainstream scheme for acquiring accurate locations in complex indoor scenes. To overcome the interference of indoor structures on radio signals and of illumination variation on visual features, semantic maps provide an effective foundation for multisource fusion localization. However, owing to the lack of visual depth information, existing indoor semantic maps suffer from large semantic segmentation errors for similar objects, which leads to unstable localization performance. To address these issues in semantic mapping and fusion localization, we develop VISEL, a localization system that demonstrates how a restudied semantic map and self-adapting fusion localization can achieve centimeter-level positioning accuracy. VISEL uses the proposed spatial attention-aware semantic model to enhance the discrimination of semantic features and thereby capture accurate semantic maps. Building on these high-precision semantic maps, VISEL employs an enhanced particle filter fusion localization module that adaptively reassigns weights to the individual localization modules, improving accuracy through the complementary strengths of different signals while mitigating the weaknesses of each signal and the interference of complex environments. Extensive experimental results show that VISEL outperforms current state-of-the-art positioning systems and achieves an average positioning accuracy of 0.4 m. By combining semantic maps with depth features and an enhanced particle filter, VISEL reduces the fusion localization error by 38%, suggesting that high-precision semantic maps with depth features can provide a robust solution for fusion localization in complex indoor scenes.
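The abstract does not give implementation details of the enhanced particle filter with adaptive source weighting, but the general idea can be illustrated with a minimal sketch: each localization source (e.g., a radio-based estimate and a visual/semantic estimate) contributes to the particle likelihood in proportion to an adaptive weight derived from its recent residuals. All names and parameters below (adapt_weights, fuse_step, the random-walk motion model, the resampling threshold) are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def adapt_weights(residuals, temperature=1.0):
    """Give larger weight to sources whose recent residuals are smaller (assumed scheme)."""
    scores = np.exp(-np.asarray(residuals, dtype=float) / temperature)
    return scores / scores.sum()

def fuse_step(particles, weights, observations, obs_sigmas, source_weights):
    """One particle-filter update fusing position estimates from several sources.

    particles:      (N, 2) candidate 2-D positions
    weights:        (N,) current particle weights
    observations:   list of (x, y) estimates, one per source (e.g., radio, visual)
    obs_sigmas:     per-source measurement noise standard deviation
    source_weights: adaptive per-source weights (sum to 1), e.g., from adapt_weights()
    """
    # Prediction: simple random-walk motion model (placeholder assumption)
    particles = particles + np.random.normal(0.0, 0.05, particles.shape)

    # Update: weighted product of per-source Gaussian likelihoods in log space
    log_like = np.zeros(len(particles))
    for obs, sigma, w in zip(observations, obs_sigmas, source_weights):
        d2 = np.sum((particles - np.asarray(obs)) ** 2, axis=1)
        log_like += w * (-d2 / (2.0 * sigma ** 2))

    weights = weights * np.exp(log_like - log_like.max())
    weights /= weights.sum()

    # Resample when the effective sample size drops below half the particle count
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = np.random.choice(len(particles), len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))

    # Fused position estimate: weighted mean of the particles
    estimate = np.average(particles, axis=0, weights=weights)
    return particles, weights, estimate
```

Under this sketch, a source that drifts (e.g., radio under multipath interference) accumulates larger residuals, so adapt_weights shrinks its influence and the visual/semantic estimate dominates the update, which is the complementary behavior the abstract describes.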
               