Cloud computing, now a mature computing service, provides an efficient and economical solution for processing big data. As such, it attracts considerable attention in academia and plays an important role in industrial applications. With the recent growth in the scale of cloud computing data centers and rising user requirements for service quality, the structure of the whole cloud system has become more complex, which has made resource scheduling and management more challenging. Thus, the goal of this research was to resolve the conflict faced by cloud service providers (CSPs) between minimizing energy costs and optimizing service quality. Leveraging the strong environmental awareness and online adaptive decision-making ability of deep reinforcement learning (DRL), we proposed an online resource scheduling framework based on the Deep Q-network (DQN) algorithm. The framework trades off the two optimization objectives, energy consumption and task makespan, by adjusting the proportion of the reward assigned to each objective. Experimental results showed that the framework effectively balances these two objectives and exhibits clear optimization gains over the baseline algorithm. Therefore, our proposed framework can dynamically adjust the optimization objective of the system according to the different requirements of the cloud system.
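The abstract describes steering the DQN agent by adjusting the proportion of the reward contributed by each objective. A minimal sketch of that idea is shown below; the function name, the linear weighted form, and the normalization assumption are illustrative guesses, not the authors' actual reward formulation.

```python
def scheduling_reward(energy_cost: float, makespan: float,
                      w_energy: float = 0.5) -> float:
    """Combine normalized energy cost and task makespan into one scalar reward.

    Hypothetical sketch: w_energy in [0, 1] sets the proportion of the
    reward driven by the energy objective; the remainder (1 - w_energy)
    weights makespan. Both inputs are assumed normalized to [0, 1] with
    smaller being better, so the reward is their negated weighted sum.
    """
    return -(w_energy * energy_cost + (1.0 - w_energy) * makespan)

# Shifting w_energy toward 1 makes the agent prioritize energy savings;
# shifting it toward 0 prioritizes shorter makespan.
```

Under this sketch, a CSP could retune the scheduler's objective at runtime simply by changing `w_energy`, without retraining the rest of the pipeline's structure.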