Moving object detection and tracking technology has been widely deployed in visual surveillance for security. Achieving real-time performance, however, remains extremely challenging owing to environmental noise, background complexity, and illumination variation. This paper proposes a novel data fusion approach to this problem, which combines an entropy-based Canny (EC) operator with the local and global optical flow (LGOF) method, termed EC-LGOF. The method operates in four steps. First, the EC operator computes the contours of moving objects in a video sequence; second, the LGOF method establishes the motion vector field. Third, the minimum error threshold selection (METS) method is employed to distinguish moving objects from the background. Finally, the edge information is fused with the temporal information from the optical flow to label the moving objects. Experimental results are reported to show the feasibility and effectiveness of the proposed method.
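To illustrate the data flow of this four-step fusion idea, a minimal Python/OpenCV sketch is given below. It substitutes off-the-shelf components for the paper's own operators: the standard Canny detector stands in for the entropy-based Canny (EC) operator, Farneback dense optical flow stands in for the local and global optical flow (LGOF) method, and Otsu thresholding stands in for minimum error threshold selection (METS). It therefore illustrates only the pipeline structure, not the authors' actual algorithms.

    import cv2
    import numpy as np

    def detect_moving_objects(prev_gray, curr_gray):
        """Sketch of the edge/optical-flow fusion pipeline on two grayscale frames."""
        # Step 1: spatial edge information from the current frame
        # (plain Canny used here in place of the entropy-based Canny operator).
        edges = cv2.Canny(curr_gray, 50, 150)

        # Step 2: temporal information as a dense motion vector field
        # (Farneback flow used here in place of the LGOF method).
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, curr_gray, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        magnitude = np.linalg.norm(flow, axis=2)

        # Step 3: separate moving pixels from the background by thresholding
        # the flow magnitude (Otsu used here in place of METS).
        mag_u8 = cv2.normalize(magnitude, None, 0, 255,
                               cv2.NORM_MINMAX).astype(np.uint8)
        _, motion_mask = cv2.threshold(mag_u8, 0, 255,
                                       cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        # Step 4: fuse the edge map with the motion mask to label moving objects.
        moving_edges = cv2.bitwise_and(edges, motion_mask)
        return moving_edges

In practice, the returned fused edge map would be post-processed (for example with morphological closing and connected-component labeling) to obtain object-level detections.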
               