Deep-Reinforcement-Learning-Based Resource Allocation for Cloud Gaming via Edge Computing

Compared with cloud computing, edge computing can effectively address the high-latency problem in cloud gaming. However, several challenges remain in optimizing system performance. On the one hand, unpredictable bursts of game requests can cause server overload and network congestion. On the other hand, the mobility of players makes the system highly dynamic. Although existing research has studied game fairness and latency separately to improve the Quality of Experience (QoE), the tradeoff between fairness and latency has been largely ignored. Furthermore, balancing network and computing load is identified as another constraint during optimization. Focusing on latency, fairness, and load balance simultaneously, we propose an adaptive resource allocation strategy based on deep reinforcement learning (DRL) for a dynamic gaming system. Experimental results demonstrate that the proposed algorithm outperforms traditional optimization methods and classical reinforcement learning algorithms on complex multimodal reward problems.
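
As a rough illustration of how such a multi-objective signal might be shaped, the sketch below is not taken from the paper: the weights, the use of Jain's fairness index, and the normalization are all assumptions. It combines per-player latency, latency fairness, and edge-server load balance into a single scalar reward of the kind a DRL agent could receive after each allocation step.

```python
import numpy as np

# Hypothetical reward shaping for edge-assisted cloud gaming:
# reward low mean latency, fair latencies across players, and
# balanced load across edge servers. Weights are placeholders.

def jain_fairness(values):
    """Jain's fairness index: 1.0 means all values are equal."""
    values = np.asarray(values, dtype=float)
    return values.sum() ** 2 / (len(values) * (values ** 2).sum() + 1e-9)

def reward(latencies_ms, server_loads, w_lat=1.0, w_fair=0.5, w_load=0.5):
    """Scalar reward combining latency, fairness, and load balance."""
    latency_term = -np.mean(latencies_ms) / 100.0   # lower mean latency is better
    fairness_term = jain_fairness(latencies_ms)     # in (0, 1], higher is fairer
    load_term = -np.std(server_loads)               # penalize load imbalance
    return w_lat * latency_term + w_fair * fairness_term + w_load * load_term

# Example: three players served by two edge servers
print(reward(latencies_ms=[35.0, 40.0, 90.0], server_loads=[0.8, 0.3]))
```

In such a formulation, the relative weights would set the latency-fairness tradeoff the paper highlights; the authors' actual state, action, and reward definitions are not reproduced here.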

Keywords: deep reinforcement learning; reinforcement learning; edge computing; resource allocation; cloud gaming

Journal Title: IEEE Internet of Things Journal
Year Published: 2023
