Tracking objects with partial occlusion by background alignment

Abstract: Visual object tracking is a challenging and fundamental research topic in the field of computer vision. In recent years, many subspace-learning-based methods have been proposed for visual object tracking with promising results. These methods reconstruct candidate states from a set of basis vectors and select the state with the minimum reconstruction error. It is well known that the accuracy of the reconstruction is seriously degraded by partial occlusion. Moreover, updating the appearance model with occluded observations is likely to cause the tracker to drift away. Existing methods either do not consider these situations or locate occlusion regions only from the current observation and its reconstruction. In fact, occlusion regions usually come from background regions in previous frames, which existing methods neglect. Under this assumption, a novel object tracking algorithm called Partial Occlusion by Background Alignment (POBA) is proposed, which aims to find the best candidate state together with an accurate occlusion mask. The POBA tracker treats the current observation as a combination of object appearance and occlusion regions. The object appearance is modelled by basis vectors obtained through incremental PCA over grayscale images. The occlusion region is then reconstructed from the last frame under the assumption that the backgrounds of two consecutive frames are almost identical. In addition, most candidate states differ obviously from the object and can be filtered out with a set of predefined occlusion masks, further reducing computational complexity. Finally, the POBA tracker was analyzed on 8 challenging sequences and evaluated on two challenging datasets, OTB2015 and Temple Color. It achieves an AUC of 0.456, a success rate of 0.538, and a precision score of 0.626 in OPE on the OTB2015 dataset. All indicators improve by more than 23% compared with the 6 classical trackers.
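
The masked candidate-selection idea described in the abstract can be illustrated with a minimal sketch: each candidate patch is reconstructed from the PCA basis using only the pixels a predefined occlusion mask marks as visible, and the candidate/mask pair with the smallest normalised reconstruction error is chosen. All names here (select_candidate, occlusion_masks, the toy masks) are illustrative assumptions, not the paper's implementation; in particular, this sketch omits POBA's reconstruction of the occlusion region from the previous frame's background.

```python
import numpy as np


def select_candidate(candidates, basis, mean, occlusion_masks):
    """Return (candidate index, mask, error) with the smallest masked
    reconstruction error.

    candidates      : (N, d) array of vectorised grayscale candidate patches
    basis           : (d, k) matrix of PCA basis vectors (e.g. incremental PCA)
    mean            : (d,) mean appearance vector
    occlusion_masks : iterable of (d,) boolean arrays; True marks pixels
                      assumed occluded (hypothetical predefined masks)
    """
    best = (None, None, np.inf)
    for i, y in enumerate(candidates):
        centred = y - mean
        for mask in occlusion_masks:
            visible = ~mask
            # Least-squares subspace coefficients using only visible pixels.
            coeff, *_ = np.linalg.lstsq(basis[visible], centred[visible], rcond=None)
            residual = centred[visible] - basis[visible] @ coeff
            # Normalise by the number of visible pixels so larger masks are
            # not trivially favoured.
            err = float((residual ** 2).sum()) / max(int(visible.sum()), 1)
            if err < best[2]:
                best = (i, mask, err)
    return best


# Toy usage: 20 random 32x32 candidates, a 16-vector subspace, and three
# hand-made masks (no occlusion, left half occluded, bottom half occluded).
d, k = 32 * 32, 16
rng = np.random.default_rng(0)
candidates = rng.random((20, d))
basis, _ = np.linalg.qr(rng.standard_normal((d, k)))
mean = candidates.mean(axis=0)
left = np.zeros((32, 32), bool); left[:, :16] = True
bottom = np.zeros((32, 32), bool); bottom[16:, :] = True
masks = [np.zeros(d, bool), left.ravel(), bottom.ravel()]
idx, mask, err = select_candidate(candidates, basis, mean, masks)
print(idx, err)
```

Normalising the error by the number of visible pixels is one simple way to keep masks with large occluded areas from winning by default; the paper may use a different criterion.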

Keywords: background alignment; occlusion background; occlusion; partial occlusion

Journal Title: Neurocomputing
Year Published: 2020

