
Deep Reinforcement Learning for Energy-Efficient Federated Learning in UAV-Enabled Wireless Powered Networks

Federated learning (FL) is a promising approach to privacy preservation for data-driven deep learning. However, enabling FL in unmanned aerial vehicle (UAV)-assisted wireless networks remains challenging due to the limited resources and battery capacity of the UAV and user devices. In this regard, we propose a deep reinforcement learning (DRL)-based framework for joint UAV placement and resource allocation to enable sustainable FL with energy-harvesting user devices. We aim to maximize the long-term FL performance under the network's limited resources, such as harvested energy, bandwidth, and the UAV's energy budget. To reduce the complexity of the original problem, we leverage the Lyapunov optimization technique to transform the long-term energy constraint into a deterministic per-slot problem. We then reformulate the optimization problem as a Markov decision process (MDP) and design a DRL-based algorithm to solve it. The proposed solution can guarantee the sustainable operation of UAV-aided wireless networks by improving the network's long-term energy conservation.
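A minimal sketch of the Lyapunov virtual-queue idea that the abstract describes, written under standard drift-plus-penalty assumptions. The constants, the toy channel and energy models, and the random placeholder action are illustrative assumptions, not the paper's actual design; in the proposed framework a DRL agent would choose the UAV placement and bandwidth allocation from the MDP state instead.

```python
import numpy as np

# Illustrative constants (assumptions, not values from the paper).
NUM_DEVICES = 5          # energy-harvesting user devices
NUM_BW_UNITS = 10        # orthogonal bandwidth units to allocate per slot
V = 10.0                 # Lyapunov drift-plus-penalty trade-off weight
NUM_SLOTS = 200

rng = np.random.default_rng(0)
device_pos = rng.uniform(-1.0, 1.0, size=(NUM_DEVICES, 2))  # fixed device locations

# Virtual energy-deficit queues: standard Lyapunov optimization turns a
# long-term average energy constraint (consumed <= harvested on average)
# into per-slot queue dynamics Q_k(t+1) = max(Q_k(t) + used_k - harvested_k, 0);
# keeping the queues stable enforces the long-term constraint.
Q = np.zeros(NUM_DEVICES)

for t in range(NUM_SLOTS):
    # Per-slot state: energy harvested by each device this slot.
    harvested = rng.uniform(0.0, 0.2, NUM_DEVICES)

    # Action: joint UAV placement and bandwidth allocation.  Random placeholder
    # here; the paper's DRL agent would map the MDP state (queues, channels,
    # harvested energy) to this action.
    uav_pos = rng.uniform(-1.0, 1.0, size=2)
    bw_alloc = rng.multinomial(NUM_BW_UNITS, np.ones(NUM_DEVICES) / NUM_DEVICES)

    # Toy channel and energy models driven by UAV-device distance (assumptions).
    dist = np.linalg.norm(device_pos - uav_pos, axis=1) + 0.1
    rate = bw_alloc * np.log2(1.0 + 1.0 / dist**2)   # proxy for FL upload rate
    energy_used = 0.02 * bw_alloc * dist             # proxy for device energy cost

    # Per-slot drift-plus-penalty reward:
    #   V * (FL utility) - sum_k Q_k * (energy used_k - harvested_k)
    reward = V * rate.sum() - np.dot(Q, energy_used - harvested)

    # Lyapunov virtual-queue update: deficit grows when usage exceeds harvest.
    Q = np.maximum(Q + energy_used - harvested, 0.0)

print("final energy-deficit queues:", np.round(Q, 3))
```

The per-slot reward couples FL progress with the energy-deficit queues, so an agent maximizing it is implicitly penalized for drawing more energy than devices harvest, which is how the long-term constraint is respected without solving the original long-horizon problem directly.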

Keywords: wireless; reinforcement learning; federated learning; deep reinforcement; energy; uav

Journal Title: IEEE Communications Letters
Year Published: 2022
