A 360-degree video streaming system delivers a monocular panoramic video that surrounds the user, who can change the viewing direction of a mobile device to see different parts of the video through the “viewport”. Due to limited network bandwidth, playback of high-resolution 360-degree videos often suffers from rebuffering, while much bandwidth is wasted delivering out-of-viewport regions that the user never watches. In this article, we present an Ensemble Prediction and Allocation based Streaming System, named EPASS360, for delivering 360-degree videos with high Quality of Experience (QoE). The prediction model takes advantage of ensemble learning to achieve high viewport-prediction accuracy. The allocation model divides a video into tiles and assigns high resolution to the tiles where the user’s viewpoint is likely to appear, by solving a QoE-aware optimization problem. Trace-driven emulation on real-world datasets shows that EPASS360 improves QoE over state-of-the-art streaming approaches in various scenarios. Experiments with a head-mounted device and a hand-held device over the real-world Internet confirm that EPASS360 delivers a high user experience.
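To make the tile-allocation idea concrete, the sketch below greedily upgrades tile bitrates under a bandwidth budget, weighting each upgrade by the predicted probability that the tile falls in the viewport. This is only an illustration under assumptions: the function name, the logarithmic quality model, and the greedy heuristic are not taken from the paper and do not represent EPASS360's actual optimization formulation.

```python
import math
from typing import List

def allocate_tile_bitrates(view_probs: List[float],
                           bitrate_levels: List[float],
                           bandwidth_budget: float) -> List[int]:
    """Greedy allocation (illustrative, not the paper's method):
    repeatedly upgrade the tile with the best probability-weighted
    quality gain per extra kbps until the budget is exhausted."""
    n = len(view_probs)
    levels = [0] * n                          # every tile starts at the lowest level
    used = n * bitrate_levels[0]

    while True:
        best_tile, best_score = -1, 0.0
        for t in range(n):
            nxt = levels[t] + 1
            if nxt >= len(bitrate_levels):
                continue                      # already at the top level
            extra = bitrate_levels[nxt] - bitrate_levels[levels[t]]
            if used + extra > bandwidth_budget:
                continue                      # upgrade would exceed the budget
            # concave (log) quality model, weighted by viewport probability
            gain = view_probs[t] * (math.log(bitrate_levels[nxt]) -
                                    math.log(bitrate_levels[levels[t]]))
            score = gain / extra
            if score > best_score:
                best_tile, best_score = t, score
        if best_tile < 0:
            return levels                     # no affordable upgrade remains
        levels[best_tile] += 1
        used += (bitrate_levels[levels[best_tile]] -
                 bitrate_levels[levels[best_tile] - 1])


# Example: 4 tiles, the first two predicted to be in the viewport.
# With levels of 500/1500/4000 kbps and an 8000 kbps budget, the
# in-viewport tiles receive the higher resolutions.
print(allocate_tile_bitrates([0.9, 0.7, 0.1, 0.05],
                             [500.0, 1500.0, 4000.0],
                             8000.0))
```

In this toy setup, the high-probability tiles are upgraded first, which mirrors the general intent of viewport-driven allocation: spend scarce bandwidth where the viewer is expected to look.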