Cloud gaming has become very popular in recent years, but maintaining low interaction delay to guarantee a satisfactory user experience remains a prevalent challenge. We observe that the server-side processing delay in a cloud gaming system can be heavily influenced by how resources are partitioned among processes. However, finding the optimal partitioning policy that minimizes the response delay faces several critical challenges. First, fine-grained resource partitioning is non-trivial due to the limitations of hardware-based resource isolation techniques. Second, game workload is highly dynamic and unpredictable, making the design of an efficient resource partitioning policy more challenging. In this article, we propose an online resource partitioning framework for reducing response delay in cloud gaming, which has several promising properties. First, we divide the processes into disjoint groups and partition resources among process groups, which greatly simplifies the resource partitioning problem while ensuring high partitioning effectiveness. Second, to tackle dynamic workload changes, we classify game workloads into several clusters and maintain a separate process grouping plan for each cluster. Third, we leverage reinforcement learning to adaptively choose the best actions for minimizing response delay in real time. We evaluate the proposed framework in a real cloud gaming environment using several real games. The experimental results show that our approach reduces the response delay by 22 to 41 percent compared to a system without resource partitioning, and significantly outperforms other resource partitioning policies.
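As a rough illustration of how the three ideas in the abstract fit together (process grouping, workload clustering, and learning-based partitioning decisions), the following is a minimal, hypothetical sketch rather than the paper's actual system. The process-group action set, the workload clusters, the simulated delay model, and the bandit-style value update (a simplification of a full reinforcement-learning formulation) are all illustrative assumptions.

```python
"""Hypothetical sketch: per-cluster resource-split learning for cloud gaming.

Everything below (identifiers, action set, clusters, delay model) is an
illustrative assumption, not the paper's actual design.
"""
import random
from collections import defaultdict

# Candidate partitioning actions: fraction of CPU/LLC given to the render
# process group; the remainder goes to the game-logic group (assumption).
ACTIONS = [0.3, 0.4, 0.5, 0.6, 0.7]

# Hypothetical workload clusters (e.g., produced offline by clustering
# per-frame CPU/GPU utilisation traces).
CLUSTERS = ["light", "medium", "heavy"]


def simulated_response_delay(cluster: str, render_share: float) -> float:
    """Stand-in for measuring server-side response delay (ms)."""
    base = {"light": 20.0, "medium": 35.0, "heavy": 55.0}[cluster]
    optimum = {"light": 0.4, "medium": 0.5, "heavy": 0.6}[cluster]
    # Toy model: delay grows when the render group is starved or over-provisioned.
    return base * (1.0 + 2.0 * abs(render_share - optimum)) + random.uniform(-1.0, 1.0)


class PartitionAgent:
    """Epsilon-greedy learner over (workload cluster, partitioning action).

    Uses a bandit-style value update (no next-state bootstrapping) as a
    simplified stand-in for a reinforcement-learning agent.
    """

    def __init__(self, epsilon: float = 0.1, alpha: float = 0.2):
        self.q = defaultdict(float)  # value estimate per (cluster, action)
        self.epsilon = epsilon       # exploration rate
        self.alpha = alpha           # learning rate

    def choose(self, cluster: str) -> float:
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[(cluster, a)])

    def update(self, cluster: str, action: float, delay_ms: float) -> None:
        reward = -delay_ms  # lower delay => higher reward
        key = (cluster, action)
        self.q[key] += self.alpha * (reward - self.q[key])


if __name__ == "__main__":
    agent = PartitionAgent()
    for _ in range(3000):
        cluster = random.choice(CLUSTERS)   # observed workload cluster
        action = agent.choose(cluster)      # pick a resource split for this cluster
        delay = simulated_response_delay(cluster, action)
        agent.update(cluster, action, delay)

    for c in CLUSTERS:
        best = max(ACTIONS, key=lambda a: agent.q[(c, a)])
        print(f"cluster={c}: learned render-group share = {best}")
```

In this toy setup the learner converges toward a different resource split per workload cluster, which mirrors the abstract's point that a single static partitioning plan cannot serve highly dynamic game workloads.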
               