A Realization of Fog-RAN Slicing via Deep Reinforcement Learning

To meet the wide range of 5G use cases in a cost-efficient way, network slicing has been advocated as a key enabler. Unlike core network slicing in a virtualized environment, radio access network (RAN) slicing is still in its infancy and its realization is challenging. In this paper, we investigate a realization approach for fog RAN slicing, in which two network slice instances, one for a hotspot scenario and one for a vehicle-to-infrastructure scenario, are considered and orchestrated. In particular, the RAN slicing framework is formulated as an optimization problem that jointly tackles content caching and mode selection, characterizing the time-varying channel and the unknown content popularity distribution. Owing to diverse user demands and limited resources, the complexity of the original optimization problem is significantly high, which makes traditional optimization approaches hard to apply directly. To deal with this dilemma, a deep reinforcement learning algorithm is proposed, whose core idea is that the cloud server makes proper decisions on content caching and mode selection to maximize the reward under the dynamic channel state and cache status. Simulation results demonstrate that the proposed approach significantly improves performance in terms of hit ratio and sum transmit rate.
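
The decision loop described in the abstract, an agent at the cloud server observing the cache status and channel state and choosing a joint caching/mode-selection action to maximize a reward, maps naturally onto a standard deep Q-learning setup. The minimal Python sketch below illustrates that structure only; the state encoding, the reward mix of hit ratio and transmit rate, the network sizes, and the toy environment are illustrative assumptions, not the authors' implementation.

# Minimal deep Q-learning sketch (not the paper's implementation): a cloud-side
# agent picks a joint (content-to-cache, transmission-mode) action to maximize a
# reward that mixes cache hits and transmit rate. All names and dimensions are
# illustrative assumptions.
import random
import torch
import torch.nn as nn

N_CONTENTS, N_MODES = 4, 2            # assumed: cacheable contents, transmission modes
STATE_DIM = N_CONTENTS + 1            # assumed state: cache occupancy + channel quality
N_ACTIONS = N_CONTENTS * N_MODES      # joint action: (content to cache, mode to use)

q_net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(), nn.Linear(64, N_ACTIONS))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma, epsilon = 0.95, 0.1
replay = []                            # simple replay buffer of (s, a, r, s') tuples

def toy_env_step(state, action):
    """Hypothetical environment: reward combines a cache hit with the achieved rate."""
    content, mode = divmod(action, N_MODES)
    next_state = state.clone()
    next_state[content] = 1.0                      # cache the chosen content
    next_state[-1] = torch.rand(1).item()          # new time-varying channel quality
    hit = state[content].item()                    # 1.0 if the content was already cached
    rate = next_state[-1].item() * (1.5 if mode == 1 else 1.0)
    return next_state, hit + rate

state = torch.zeros(STATE_DIM)
for step in range(2000):
    # epsilon-greedy selection over the joint caching/mode action space
    if random.random() < epsilon:
        action = random.randrange(N_ACTIONS)
    else:
        with torch.no_grad():
            action = q_net(state).argmax().item()
    next_state, reward = toy_env_step(state, action)
    replay.append((state, action, reward, next_state))
    state = next_state

    # one-step Q-learning update on a random mini-batch from the replay buffer
    batch = random.sample(replay, min(32, len(replay)))
    s = torch.stack([b[0] for b in batch])
    a = torch.tensor([b[1] for b in batch])
    r = torch.tensor([b[2] for b in batch])
    s2 = torch.stack([b[3] for b in batch])
    q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
    target = r + gamma * q_net(s2).max(1).values.detach()
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()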

Keywords: fog RAN; reinforcement learning; RAN slicing; deep reinforcement learning; realization; network slicing

Journal Title: IEEE Transactions on Wireless Communications
Year Published: 2020
