
Edge QoE: Intelligent Big Data Caching via Deep Reinforcement Learning


In mobile edge networks (MENs), big data caching services are expected to give mobile users a better quality of experience (QoE) than conventional network scenarios. However, the growing variety of sensors and devices is producing an explosion of big data, and extracting the contents worth caching has become essential to satisfying QoE requirements. Devising rational caching strategies that improve QoE is therefore the central challenge of content-centric caching. This article introduces a big data architecture built around data management units that handle content extraction and caching decisions, improving quality of service and ensuring QoE. On top of this architecture, a caching strategy is proposed to improve QoE, consisting of three parts: (1) caching location decision, which determines where to deploy caching nodes so that they sit closer to users; (2) caching capacity assessment, which selects contents that fit the capacity of the caching nodes; and (3) caching priority choice, which caches contents according to their priority so as to meet user demands. Within this architecture and strategy, a caching algorithm based on deep reinforcement learning is used to achieve lower-cost intelligent caching. Experimental results indicate that the proposed schemes achieve higher QoE than existing algorithms.
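The abstract describes the deep-reinforcement-learning caching decision only at a high level. As a concrete illustration, below is a minimal, self-contained sketch of a DQN-style cache admission/eviction loop at a single edge node. The DQN formulation itself, the catalogue size, cache capacity, Zipf-like request model, QoE-style hit reward, and all hyperparameters are illustrative assumptions, not details taken from the article.

```python
# Minimal sketch: a DQN-style agent deciding, per request, which cache slot to
# evict (or whether to cache at all). All values below are hypothetical.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

NUM_CONTENTS = 50      # catalogue size (hypothetical)
CACHE_CAPACITY = 5     # slots at the edge caching node (hypothetical)

class QNet(nn.Module):
    """Q-network over the cache state plus the requested content ID.
    Action i < CACHE_CAPACITY: evict slot i and admit the request;
    action CACHE_CAPACITY: do not cache the request."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(CACHE_CAPACITY + 1, 64), nn.ReLU(),
            nn.Linear(64, CACHE_CAPACITY + 1))

    def forward(self, x):
        return self.net(x)

def state_vec(cache, request):
    # Normalised content IDs of the cached items plus the current request.
    return torch.tensor([*cache, request], dtype=torch.float32) / NUM_CONTENTS

def popularity_request():
    # Zipf-like demand: lower IDs are requested more often (illustrative).
    weights = [1.0 / (i + 1) for i in range(NUM_CONTENTS)]
    return random.choices(range(NUM_CONTENTS), weights=weights)[0]

qnet = QNet()
opt = optim.Adam(qnet.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)
gamma, eps, batch_size = 0.9, 0.1, 64

cache = list(range(CACHE_CAPACITY))
req = popularity_request()
for step in range(2000):
    s = state_vec(cache, req)
    # Epsilon-greedy choice of eviction slot (or "do not cache").
    if random.random() < eps:
        a = random.randrange(CACHE_CAPACITY + 1)
    else:
        with torch.no_grad():
            a = int(qnet(s).argmax())
    if req not in cache and a < CACHE_CAPACITY:
        cache[a] = req                        # evict slot a, admit the request
    next_req = popularity_request()
    reward = 1.0 if next_req in cache else -0.1   # QoE proxy: reward cache hits
    s2 = state_vec(cache, next_req)
    replay.append((s, a, reward, s2))
    req = next_req

    # One-step DQN update from a random minibatch of stored transitions.
    if len(replay) >= batch_size:
        batch = random.sample(replay, batch_size)
        ss = torch.stack([b[0] for b in batch])
        aa = torch.tensor([b[1] for b in batch])
        rr = torch.tensor([b[2] for b in batch])
        s2s = torch.stack([b[3] for b in batch])
        q = qnet(ss).gather(1, aa.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = rr + gamma * qnet(s2s).max(1).values
        loss = nn.functional.mse_loss(q, target)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

In this sketch the state is simply the cached content IDs plus the incoming request, and the reward is a cache-hit indicator used as a rough proxy for QoE; a fuller treatment along the lines of the article would also fold in caching location, node capacity, and content priority.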

Keywords: data caching; deep reinforcement learning; QoE; big data; edge

Journal Title: IEEE Network
Year Published: 2020
