Deep Reinforcement Learning-Based Charging Pricing for Autonomous Mobility-on-Demand System

The autonomous mobility-on-demand (AMoD) system plays an important role in urban transportation, and the charging behavior of the AMoD fleet is a critical link between the charging and transportation systems. In this paper, we investigate a strategic charging pricing scheme for charging station operators (CSOs) based on a non-cooperative Stackelberg game framework. The Stackelberg equilibrium captures both the pricing competition among multiple CSOs and the nexus between the CSOs and the AMoD operator. In the proposed framework, the responsive behavior of the AMoD operator (order serving, repositioning, and charging) is formulated as a multi-commodity network flow model that solves an energy-aware traffic flow problem. Meanwhile, a soft actor-critic-based multi-agent deep reinforcement learning algorithm is developed to solve the proposed equilibrium framework while respecting privacy-preservation constraints among the CSOs. A numerical case study with city-scale real-world data validates the effectiveness of the proposed framework.
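To make the leader-follower structure concrete, the sketch below mimics the interaction loop the abstract describes: each CSO (leader) posts a charging price, the AMoD operator (follower) routes charging demand in response, and each CSO adapts using only its own feedback. This is not the authors' code. The station data, the softmax demand split (standing in for the paper's multi-commodity network flow model), and the hill-climbing price update (standing in for the soft actor-critic agents) are all illustrative assumptions.

```python
# Minimal sketch of the Stackelberg pricing loop described in the abstract.
# All numbers and the follower's response model are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

N_CSOS = 3            # competing charging station operators (leaders)
TOTAL_DEMAND = 100.0  # kWh of fleet charging demand per pricing round
TRAVEL_COST = np.array([0.05, 0.10, 0.02])  # assumed detour cost per kWh

def amod_response(prices: np.ndarray) -> np.ndarray:
    """Follower: split the fleet's charging demand across stations.

    The paper formulates this as a multi-commodity network flow problem;
    here a softmax over generalized cost (price + travel cost) is a cheap,
    smooth proxy for that lower-level optimization.
    """
    cost = prices + TRAVEL_COST
    weights = np.exp(-10.0 * cost)      # lower generalized cost -> more flow
    return TOTAL_DEMAND * weights / weights.sum()

# Leaders: independent learners, one price per CSO. A tiny random
# hill-climbing update stands in for the SAC agents in the paper.
prices = np.full(N_CSOS, 0.30)          # $/kWh, initial guess
step = 0.01

for episode in range(500):
    for i in range(N_CSOS):
        # Each CSO observes only its own revenue, a stand-in for the
        # privacy-preservation constraint among CSOs.
        base_revenue = prices[i] * amod_response(prices)[i]
        trial = prices.copy()
        trial[i] = max(trial[i] + step * rng.choice([-1.0, 1.0]), 0.0)
        if trial[i] * amod_response(trial)[i] > base_revenue:
            prices[i] = trial[i]        # keep the improving move

demand = amod_response(prices)
for i in range(N_CSOS):
    print(f"CSO {i}: price ${prices[i]:.2f}/kWh, demand {demand[i]:.1f} kWh")
```

The sketch preserves only the bilevel structure: leaders move first on prices, the follower best-responds, and learning proceeds from local observations. In the paper's framework, the follower's response is the energy-aware multi-commodity flow problem and the leaders are trained with multi-agent soft actor-critic rather than hill climbing.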

Keywords: autonomous mobility-on-demand; deep reinforcement learning; charging pricing

Journal Title: IEEE Transactions on Smart Grid
Year Published: 2022
