
Multi-Armed Bandit Learning for Cache Content Placement in Vehicular Social Networks


In this letter, the efficient dissemination of content in a socially aware, cache-enabled hybrid network is analyzed using multi-armed bandit learning theory. Specifically, an overlay cellular network over a vehicular social network is considered, where commuters request multimedia content from the stationary road-side units (RSUs), the base station, or, if accessible, a single mobile cache unit (MCU). First, we propose an algorithm to optimally distribute popular content among the locally deployed RSU caches. To further maximize the cache hits experienced by vehicles, we then present an algorithm that finds the best traversal path for the MCU based on the commuters’ social degree distribution. For performance evaluation, the asymptotic regret upper bounds of the two algorithms are also derived. Simulations reveal that the proposed algorithms outperform existing content placement methods in terms of overall network throughput.
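
The letter itself does not include code, but the multi-armed bandit idea behind the RSU content-placement algorithm can be illustrated with a standard UCB1 rule: each candidate content item is an arm, the reward for a cached item is a hit (1) or miss (0), and the RSU repeatedly caches the items with the highest upper-confidence index. The sketch below is an assumption-laden illustration of that generic technique, not the authors' algorithm; all names (ContentPlacementUCB, cache_size, the toy popularity vector) are hypothetical.

```python
import math
import random

class ContentPlacementUCB:
    """Illustrative UCB1 bandit: arms are content items, reward is a cache hit (1) or miss (0)."""

    def __init__(self, num_contents, cache_size):
        self.num_contents = num_contents      # number of candidate content items (arms)
        self.cache_size = cache_size          # how many items the RSU can cache per round
        self.counts = [0] * num_contents      # times each item has been cached so far
        self.hit_sums = [0.0] * num_contents  # cumulative observed hits per item

    def select_cache(self, t):
        """Return the cache_size items with the highest UCB index at round t."""
        scored = []
        for i in range(self.num_contents):
            if self.counts[i] == 0:
                ucb = float("inf")            # force initial exploration of every item
            else:
                mean = self.hit_sums[i] / self.counts[i]
                ucb = mean + math.sqrt(2.0 * math.log(t) / self.counts[i])
            scored.append((ucb, i))
        scored.sort(reverse=True)
        return [i for _, i in scored[: self.cache_size]]

    def update(self, cached_items, hits):
        """Record the observed hit/miss outcome for each item cached this round."""
        for item, hit in zip(cached_items, hits):
            self.counts[item] += 1
            self.hit_sums[item] += hit


# Toy usage: content popularities are unknown to the RSU; the bandit learns which items to cache.
popularity = [0.7, 0.5, 0.2, 0.1, 0.05]
bandit = ContentPlacementUCB(num_contents=5, cache_size=2)
for t in range(1, 1001):
    cached = bandit.select_cache(t)
    hits = [1 if random.random() < popularity[i] else 0 for i in cached]
    bandit.update(cached, hits)
print(bandit.counts)  # plays should concentrate on the two most popular items
```

The same bandit template could in principle be reused for the MCU traversal problem by treating candidate routes as arms, though the letter's actual algorithm additionally exploits the commuters' social degree distribution.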

Keywords: multi-armed bandit learning; cache content placement; vehicular social networks

Journal Title: IEEE Communications Letters
Year Published: 2019
