Network slicing (NS) has been widely identified as a key architectural technology for 5G-and-beyond systems, supporting divergent requirements in a sustainable way. In radio access network (RAN) slicing, the three-layer association relationship among devices, base stations (BSs), and NSs makes device association (including access control and handoff management) an essential yet challenging issue. With growing concerns about data security and device privacy, exploiting local resources to solve the device association problem while preserving data security and device privacy becomes attractive. Fortunately, the recently emerging federated learning (FL), a distributed learning paradigm with built-in data protection, provides an effective tool for addressing this type of issue in mobile networks. In this paper, we propose an efficient device association scheme for RAN slicing that exploits a hybrid federated deep reinforcement learning (HDRL) framework, with the aim of improving network throughput while reducing handoff cost. In the proposed framework, individual smart devices train a local machine learning model on local data and then send the model features to the serving BS/encrypted party for aggregation, which reduces the bandwidth consumed by learning while preserving data privacy. Specifically, we use deep reinforcement learning to train the local model on smart devices under a hybrid FL framework, where horizontal FL is employed for parameter aggregation at the BS, and vertical FL is employed for NS/BS pair selection aggregation at the encrypted party. Numerical results show that the proposed HDRL scheme achieves significant gains in network throughput and communication efficiency compared with state-of-the-art solutions.
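To make the horizontal-FL step concrete, below is a minimal sketch of BS-side parameter aggregation over locally trained device models. The function name `fed_avg`, the flattened-parameter representation, and the sample-count weighting are illustrative assumptions; the paper's exact HDRL update and the vertical-FL pair-selection step are not reproduced here.

```python
# Sketch of the horizontal-FL aggregation described in the abstract:
# each device trains a local (e.g., Q-network) model and uploads its
# parameters; the BS averages them, weighted by local data volume.
# All names and the plain weighted-average rule are assumptions.
import numpy as np


def fed_avg(local_params, sample_counts):
    """Return the sample-count-weighted average of local parameter vectors."""
    total = sum(sample_counts)
    return sum((n / total) * p for p, n in zip(local_params, sample_counts))


if __name__ == "__main__":
    # Three devices report flattened model weights plus their local data sizes.
    rng = np.random.default_rng(0)
    params = [rng.normal(size=8) for _ in range(3)]
    counts = [100, 250, 150]
    print(fed_avg(params, counts))
```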