The management of remote services, such as remote surgery, remote sensing, or remote driving, has become increasingly important, especially with the emergence of 5G and Beyond 5G technologies. However, the strict network requirements of these remote services represent one of the major challenges hindering their fast and large-scale deployment in critical infrastructures. This article addresses issues inherent in the remote and immersive control of virtual reality (VR)-based unmanned aerial vehicles (UAVs), whereby a user remotely controls UAVs, equipped with 360° cameras, using a head-mounted display (HMD) and its respective controllers. Remote and immersive control services based on 360° video streams require very low latency and high throughput to achieve true immersion and high service reliability. To assess and analyze these requirements, this article introduces a real-life testbed system that leverages different technologies (e.g., VR, 360° video streaming over 4G/5G, and edge computing). The performance evaluation considers three latency types: 1) the glass-to-glass latency between the 360° camera of a remote UAV and the HMD screen; 2) the user/pilot's reaction latency; and 3) the command/execution latency. The obtained results indicate that the responsiveness (dubbed Glass-to-Reaction-to-Execution, or GRE, latency) of a pilot using our system to a sudden event is within an acceptable range, i.e., around 900 ms.
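The GRE latency described above is the sum of the three measured components. A minimal sketch of this decomposition follows; the component values are hypothetical placeholders chosen to land near the ~900 ms figure reported in the abstract, not measurements from the paper.

```python
# Sketch of the GRE (Glass-to-Reaction-to-Execution) latency decomposition.
# All component values below are hypothetical, for illustration only.

def gre_latency_ms(glass_to_glass_ms: float,
                   reaction_ms: float,
                   command_execution_ms: float) -> float:
    """Total time from an event appearing on the remote UAV's 360° camera
    to the UAV executing the pilot's corrective command."""
    return glass_to_glass_ms + reaction_ms + command_execution_ms

# Hypothetical breakdown summing to roughly the reported ~900 ms.
total = gre_latency_ms(glass_to_glass_ms=450.0,   # camera -> HMD screen
                       reaction_ms=300.0,          # pilot's reaction time
                       command_execution_ms=150.0) # command -> UAV actuation
print(f"GRE latency: {total:.0f} ms")  # GRE latency: 900 ms
```

In practice, each component would be measured independently on the testbed (e.g., glass-to-glass via a timestamped visual marker), and the sum gives the end-to-end responsiveness.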