We propose a novel, efficient online method for tracking performers on stage. Most existing tracking methods rely on expensive, high-performing sensors such as multilayer lidar sensors or high-resolution, short-range radars. Image sensor-based methods are not appropriate for tracking performers on stage because of the challenging illumination conditions caused by lighting effects. In this paper, we introduce a robust multi-target tracking method based on the sensor fusion of two-dimensional distance sensors, such as single-layer lidar sensors, which cost considerably less than the aforementioned types of sensors. In our method, the measurements from each sensor are transformed into a reference coordinate system, and objects are detected from these transformed measurements. The object detections are then used to generate or extend target trajectories through detection-to-trajectory matching. In our experiments, we quantitatively evaluated the proposed method on a newly constructed dataset consisting of two scenarios that simulate stage performances. For each scenario, we collected the scan results of two single-layer lidar sensors and the image frames captured by a camera sensor. The experimental results show that the proposed method robustly tracks performers in challenging scenarios in which the performers move abruptly and are densely positioned.
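The abstract outlines a three-step pipeline: transform each sensor's 2D scan into a shared reference frame, detect objects from the fused points, and match detections to existing trajectories. The following is a minimal sketch of that pipeline, not the authors' implementation; the sensor poses, the simple Euclidean clustering, and the greedy nearest-neighbour gating used for detection-to-trajectory matching are assumptions made for illustration only.

```python
# Sketch of the fusion-and-tracking pipeline described in the abstract.
# All numeric parameters (sensor poses, cluster radius, gating distance)
# are hypothetical values, not taken from the paper.
import numpy as np


def to_reference_frame(points, sensor_pose):
    """Transform Nx2 sensor-frame points using an (x, y, yaw) sensor pose."""
    x, y, yaw = sensor_pose
    c, s = np.cos(yaw), np.sin(yaw)
    rotation = np.array([[c, -s], [s, c]])
    return points @ rotation.T + np.array([x, y])


def cluster_detections(points, radius=0.3):
    """Greedy Euclidean clustering; returns one centroid per detected object."""
    points = list(points)
    centroids = []
    while points:
        seed = points.pop(0)
        cluster, rest = [seed], []
        for p in points:
            (cluster if np.linalg.norm(p - seed) < radius else rest).append(p)
        points = rest
        centroids.append(np.mean(cluster, axis=0))
    return centroids


def match_detections(trajectories, detections, gate=0.5):
    """Nearest-neighbour detection-to-trajectory matching with a distance gate.

    Matched detections extend existing trajectories; unmatched detections
    start new trajectories.  `trajectories` is a list of lists of 2D points.
    """
    unmatched = list(detections)
    for trajectory in trajectories:
        if not unmatched:
            break
        dists = [np.linalg.norm(d - trajectory[-1]) for d in unmatched]
        best = int(np.argmin(dists))
        if dists[best] < gate:
            trajectory.append(unmatched.pop(best))
    trajectories.extend([[d] for d in unmatched])
    return trajectories


if __name__ == "__main__":
    # Two hypothetical single-layer lidar sensors facing each other across the stage.
    sensor_poses = [(0.0, 0.0, 0.0), (10.0, 0.0, np.pi)]
    # Both sensors observe the same performer from opposite sides.
    scans = [
        np.array([[2.0, 1.0], [2.05, 1.02]]),   # sensor A, sensor frame
        np.array([[7.9, -1.0], [7.95, -1.01]]),  # sensor B, sensor frame
    ]
    merged = np.vstack([to_reference_frame(s, p) for s, p in zip(scans, sensor_poses)])
    detections = cluster_detections(merged)
    tracks = match_detections([], detections)
    print(tracks)  # one trajectory seeded near (2.0, 1.0) in the reference frame
```

In this toy example the two scans fuse into a single detection because both sensors see the same performer; on successive frames, calling match_detections with the previous trajectories would extend each track as long as the performer stays within the gating distance.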