Parallax-Tolerant Image Stitching Based on Robust Elastic Warping

Image stitching aims to generate high-quality panoramas at the lowest computational cost. In this paper, we propose a parallax-tolerant image stitching method based on robust elastic warping, which achieves accurate alignment and efficient processing simultaneously. Given a set of point matches between images, an analytical warping function is constructed to eliminate the parallax errors. The input images are then warped according to the deformations computed over the meshed image plane, and the seamless panorama is composed by directly reprojecting the warped images. As an important complement to the proposed method, a Bayesian model of feature refinement is proposed to adaptively remove incorrect local matches, ensuring more robust alignment than existing approaches. Moreover, our warp is highly compatible with different transformation types; a flexible strategy for combining it with a global similarity transformation is provided as an example. The performance of the proposed approach is demonstrated on several challenging cases.
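The abstract describes constructing an analytical warping function from sparse point matches and evaluating it over a meshed image plane. The paper's exact formulation is not reproduced here; the following is a minimal sketch of the general idea, assuming a Gaussian radial-basis interpolation of the match displacements (the function name `elastic_warp` and the bandwidth `sigma` are illustrative choices, not the authors' API):

```python
import numpy as np

def elastic_warp(src_pts, dst_pts, query_pts, sigma=20.0):
    """Interpolate a smooth deformation from sparse point matches.

    A simplified stand-in for an analytical elastic warp: each match
    contributes a displacement, and a Gaussian RBF spreads those
    displacements to arbitrary query points (e.g. mesh vertices).
    """
    disp = dst_pts - src_pts  # per-match displacement vectors, shape (n, 2)

    # Kernel matrix between control points; small ridge term for stability.
    d2 = np.sum((src_pts[:, None] - src_pts[None]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    w = np.linalg.solve(K + 1e-6 * np.eye(len(src_pts)), disp)

    # Evaluate the interpolated deformation at the query points.
    q2 = np.sum((query_pts[:, None] - src_pts[None]) ** 2, axis=-1)
    return query_pts + np.exp(-q2 / (2.0 * sigma ** 2)) @ w

# Usage: four matches that all shift by (5, 2); the warp reproduces
# that displacement (to numerical precision) at the control points.
src = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
dst = src + np.array([5.0, 2.0])
warped = elastic_warp(src, dst, src)
```

In a full pipeline, `query_pts` would be the vertices of the mesh over each image plane, and the warped mesh would drive the per-cell image deformation before reprojection into the panorama.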

Keywords: image stitching; parallax tolerance; robust elastic warping

Journal Title: IEEE Transactions on Multimedia
Year Published: 2018

