Mobile wireless sensor networks have been extensively deployed to enhance environmental monitoring and surveillance. The availability of low-cost mobile robots equipped with a variety of sensors makes them promising for target coverage tasks, particularly where quick, inexpensive, or temporary visual sensing solutions are required. In this paper, we consider the problem of low-complexity target tracking, in which flying robots cover and follow moving targets. We tackle this problem by clustering the targets and estimating the camera location and orientation for each cluster separately through a cover-set coverage method. We also leverage partial knowledge of target mobility to improve the efficiency of the proposed algorithms. Three computationally efficient approaches are developed: predictive fuzzy, predictive incremental fuzzy, and local incremental fuzzy. The objective is to strike a compromise among coverage efficiency, traveled distance, the number of drones required, and computational complexity. The targets move according to one of three mobility patterns: random waypoint, Manhattan grid, and reference point group mobility. The feasibility and performance of our algorithms are also evaluated on a real-world indoor testbed called drone-be-gone, using Parrot AR.Drone quadcopters. The deployment confirms the simulation results and highlights the suitability of the proposed solutions for real-time applications.
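To make the general idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: it clusters 2-D target positions with a basic fuzzy c-means routine and then places one downward-facing camera (drone) per cluster, choosing an altitude from a simple cone-shaped field-of-view model. All function names, parameters, and the coverage model here are hypothetical simplifications of the cover-set coverage method described in the abstract.

```python
# Hypothetical sketch: fuzzy clustering of targets + per-cluster camera placement.
# This is NOT the paper's algorithm; it only illustrates the overall workflow.

import numpy as np

def fuzzy_c_means(points, n_clusters, m=2.0, iters=50, seed=0):
    """Basic fuzzy c-means on 2-D target positions."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                  # fuzzy memberships
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ points) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        u = 1.0 / (d ** (2 / (m - 1)))                 # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

def camera_altitude(points, center, half_fov_deg=30.0):
    """Altitude at which a downward camera with the given half field of view
    covers every target assigned to this cluster (simple cone model)."""
    radius = np.max(np.linalg.norm(points - center, axis=1))
    return radius / np.tan(np.radians(half_fov_deg))

# Example: 12 randomly placed targets, 3 drones.
targets = np.random.default_rng(1).uniform(0, 100, size=(12, 2))
centers, memberships = fuzzy_c_means(targets, n_clusters=3)
labels = memberships.argmax(axis=1)
for k, c in enumerate(centers):
    pts = targets[labels == k]
    if len(pts) == 0:
        continue                                       # skip empty clusters
    alt = camera_altitude(pts, c)
    print(f"drone {k}: hover at ({c[0]:.1f}, {c[1]:.1f}), altitude {alt:.1f} m")
```

In the paper's setting this step would be repeated as targets move (e.g., under random waypoint, Manhattan grid, or reference point group mobility), with the predictive and incremental variants reusing previous cluster assignments and predicted target positions to reduce recomputation and drone travel.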
               