In this survey, a comprehensive study is provided regarding the use of machine learning (ML) algorithms for effective resource management in fifth-generation and beyond (5G/B5G) wireless cellular networks. The ever-increasing user requirements, their diverse nature in terms of performance metrics, and the adoption of novel technologies, such as millimeter wave transmission, massive multiple-input multiple-output configurations and non-orthogonal multiple access, render radio resource management (RRM) a highly multi-constrained problem. In this context, ML and mobile edge computing (MEC) constitute a promising framework for providing improved quality of service (QoS) to end users, since they can relax the RRM-associated computational burden. In our work, a state-of-the-art analysis of ML-based RRM algorithms, categorized by learning type, potential applications and MEC implementations, is presented to identify the best-performing solutions for the various RRM sub-problems. To demonstrate the capabilities and efficiency of ML-based algorithms in RRM, we apply and compare different ML approaches to throughput prediction, as an indicative RRM task. We formulate the problem either as a classification or as a regression task, using the metrics appropriate to each case. Finally, open issues, challenges and limitations concerning AI/ML approaches in RRM for 5G and B5G networks are discussed in detail.
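To illustrate the two formulations mentioned above (not the authors' actual experiment), the following minimal Python sketch treats throughput prediction either as regression (predicting a throughput value) or as classification (predicting a throughput class), each evaluated with its corresponding metrics. The radio features, the synthetic data, the class thresholds and the random-forest models are all illustrative assumptions.

```python
# Hedged sketch: throughput prediction as regression vs. classification.
# Features, targets and models are assumed for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, r2_score, accuracy_score, f1_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical per-user radio features: SINR (dB), allocated PRBs, CQI, speed (km/h)
X = np.column_stack([
    rng.normal(15, 8, n),
    rng.integers(1, 100, n),
    rng.integers(1, 16, n),
    rng.uniform(0, 120, n),
])
# Toy throughput target in Mbps, monotone in the radio features plus noise
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Regression formulation: predict throughput values, report MAE and R^2
reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
y_hat = reg.predict(X_te)
print(f"Regression: MAE={mean_absolute_error(y_te, y_hat):.2f}, "
      f"R2={r2_score(y_te, y_hat):.2f}")

# Classification formulation: bin throughput into low / medium / high classes
bins = np.quantile(y_tr, [0.33, 0.66])   # assumed class thresholds
to_cls = lambda v: np.digitize(v, bins)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, to_cls(y_tr))
c_hat = clf.predict(X_te)
print(f"Classification: acc={accuracy_score(to_cls(y_te), c_hat):.2f}, "
      f"macro-F1={f1_score(to_cls(y_te), c_hat, average='macro'):.2f}")
```

In practice, the choice between the two formulations trades prediction granularity (exact Mbps values) against robustness and interpretability (coarse throughput classes), which is why different metric families apply in each case.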