
Adaptive Digital Twin and Multiagent Deep Reinforcement Learning for Vehicular Edge Computing and Networks



Technological advancements in urban informatics and vehicular intelligence have enabled connected smart vehicles to serve as pervasive edge computing platforms for a plethora of powerful applications. However, various types of smart vehicles with distinct capacities, diverse applications with different resource demands, and unpredictable vehicular topology pose significant challenges to realizing efficient edge computing services. To cope with these challenges, we incorporate digital twin technology and artificial intelligence into the design of a vehicular edge computing network. It centrally exploits potential edge service matching by evaluating cooperation gains in a mirrored edge computing system, while distributively scheduling computation task offloading and edge resource allocation through a multiagent deep reinforcement learning approach. We further propose a coordination-graph-driven vehicular task offloading scheme, which minimizes offloading costs by efficiently integrating service matching exploitation and intelligent offloading scheduling in both the digital twin and the physical network. Numerical results based on real urban traffic datasets demonstrate the efficiency of the proposed schemes.
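The offloading idea in the abstract — vehicle agents learning where to place computation tasks, with pairwise coordination terms penalizing contention on shared edge servers — can be illustrated with a deliberately simplified sketch. This is not the paper's algorithm: the cost constants, the single shared state, and the independent tabular Q-learners stand in for the multiagent deep RL and coordination-graph machinery, and all names here are hypothetical.

```python
import random

random.seed(0)

N_VEHICLES = 4
N_SERVERS = 2
ACTIONS = list(range(N_SERVERS + 1))  # 0 = compute locally, 1..N = offload to that edge server

LOCAL_COST = 5.0       # assumed per-task cost of local execution
OFFLOAD_COST = 1.0     # assumed base cost of offloading to an edge server
CONGESTION_COST = 2.0  # pairwise penalty on a coordination-graph edge (same server chosen)

def joint_cost(actions):
    """Total system cost of a joint offloading decision.

    Individual terms model per-vehicle execution cost; pairwise terms
    model contention between vehicles that pick the same edge server,
    mirroring the edge structure of a coordination graph.
    """
    cost = sum(LOCAL_COST if a == 0 else OFFLOAD_COST for a in actions)
    for i in range(len(actions)):
        for j in range(i + 1, len(actions)):
            if actions[i] != 0 and actions[i] == actions[j]:
                cost += CONGESTION_COST
    return cost

# One independent learner per vehicle agent; Q[i][a] estimates the
# per-agent share of the joint cost when agent i takes action a.
Q = [[0.0] * len(ACTIONS) for _ in range(N_VEHICLES)]
ALPHA, EPS = 0.1, 0.2

def select(q):
    """Epsilon-greedy over cost estimates (lower is better)."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return min(range(len(ACTIONS)), key=lambda a: q[a])

for episode in range(3000):
    acts = [select(Q[i]) for i in range(N_VEHICLES)]
    share = joint_cost(acts) / N_VEHICLES  # naive equal credit assignment
    for i, a in enumerate(acts):
        Q[i][a] += ALPHA * (share - Q[i][a])

# Greedy joint policy after training: each agent picks its cheapest action.
greedy = [min(range(len(ACTIONS)), key=lambda a: Q[i][a]) for i in range(N_VEHICLES)]
print("joint action:", greedy, "cost:", joint_cost(greedy))
```

Under these assumed costs, all-local execution costs 20.0 while a balanced split across the two servers costs 8.0, so the learned greedy policy moves vehicles off local execution; the paper's scheme additionally uses the digital twin to evaluate such cooperation gains centrally before the distributed agents act.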

Keywords: digital twin; deep reinforcement; multiagent deep; edge computing; edge; vehicular edge

Journal Title: IEEE Transactions on Industrial Informatics
Year Published: 2022


