
Mobile Robot Self-Localization Using Omnidirectional Vision with Feature Matching from Real and Virtual Spaces



This paper presents a novel self-localization technique for mobile robots based on image feature matching from omnidirectional vision. The proposed method first constructs a virtual space with synthetic omnidirectional imaging to simulate a mobile robot equipped with an omnidirectional vision system in the real world. In the virtual space, a number of vertical and horizontal lines are generated according to the structure of the environment and imaged by the virtual omnidirectional camera using the catadioptric projection model. The omnidirectional images derived from the virtual and real environments are then used to match the synthetic lines against the real scene edges. Finally, the pose and trajectory of the mobile robot in the real world are estimated by the efficient perspective-n-point (EPnP) algorithm from the matched line features. In our experiments, the effectiveness of the proposed self-localization technique was validated by navigating a mobile robot in a real-world environment.
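
To make the pipeline concrete, the sketch below (not part of the paper) illustrates the two computational steps the abstract names: projecting a 3D point through a catadioptric camera via the unified sphere model, and recovering the camera pose with OpenCV's EPnP solver. The mirror parameter xi, the intrinsic matrix K, and the discretization of the matched lines into point correspondences are all assumptions made for illustration.

```python
import numpy as np
import cv2

# Unified-sphere catadioptric projection (Geyer/Barreto model).
# xi is the mirror parameter and K the camera intrinsics; both are
# illustrative placeholders, not values from the paper.
def project_catadioptric(P, xi, K):
    """Project a 3D point P (mirror frame) onto the omnidirectional image."""
    X, Y, Z = P
    rho = np.linalg.norm(P)              # step 1: project onto the unit sphere
    m = np.array([X / (Z + xi * rho),    # step 2: perspective projection from
                  Y / (Z + xi * rho),    #         a centre shifted by xi
                  1.0])
    u = K @ m                            # step 3: apply the intrinsics
    return u[:2] / u[2]

# EPnP pose estimation from matched features. The paper matches lines;
# here we assume the lines have been discretized into 3D/2D point
# correspondences, and that the 2D edge points have already been lifted
# back to normalized perspective coordinates (hence identity intrinsics).
def estimate_pose(points_3d, points_2d):
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_3d, dtype=np.float64),
        np.asarray(points_2d, dtype=np.float64),
        np.eye(3), None, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("EPnP did not converge")
    R, _ = cv2.Rodrigues(rvec)           # rotation as a 3x3 matrix
    return R, tvec                       # robot pose follows from (R, t)
```

EPnP needs at least four correspondences and runs in O(n) time in the number of points, which is what makes it attractive for per-frame localization during navigation.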

Keywords: feature matching; self-localization; mobile robot; omnidirectional vision

Journal Title: Applied Sciences
Year Published: 2021
