
A Deep RL-Based Algorithm for Coordinated Charging of Electric Vehicles


The development of the electric vehicle (EV) industry faces a series of issues, among which the efficient charging of multiple EVs urgently needs to be solved. This paper investigates the coordinated charging of multiple EVs with the aims of reducing the charging cost, ensuring a high battery state of charge (SoC), and avoiding transformer overload. To this end, we first formulate the EV coordinated charging problem with these multiple objectives as a Markov decision process (MDP) and then propose a multi-agent deep reinforcement learning (DRL)-based algorithm. In the proposed algorithm, a novel interaction model, the communication neural network (CommNet), is adopted to realize the distributed computation of global information (namely the electricity price, the transformer load, and the total charging cost of the EVs). Moreover, unlike most existing works, which impose specific constraints on the size, location, or topology of the distribution network, the proposed method requires only the transformer load. In addition, owing to the use of long short-term memory (LSTM) networks for price prediction, the proposed algorithm can flexibly handle various uncertain pricing mechanisms. Finally, simulations verify the effectiveness and practicability of the proposed algorithm in a residential charging area.
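The abstract's key architectural idea is that a CommNet-style interaction model lets each EV agent recover global quantities (e.g., the total charging cost) through communication rather than through a central controller. A minimal NumPy sketch of one such communication step is shown below; the function name `commnet_step` and the weight matrices `W_h` and `W_c` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def commnet_step(hidden, W_h, W_c):
    """One illustrative CommNet-style communication step (not the paper's code).

    Each agent's next hidden state combines its own hidden state with the
    mean of the other agents' hidden states, so shared global information
    can be aggregated in a distributed way.

    hidden : (n_agents, d) array of per-agent hidden states
    W_h    : (d, d) weight matrix applied to the agent's own state
    W_c    : (d, d) weight matrix applied to the communication vector
    """
    n = hidden.shape[0]
    # Communication vector for agent i: the mean of all OTHER agents' states,
    # computed via the column sums so no agent needs the full state list twice.
    comm = (hidden.sum(axis=0, keepdims=True) - hidden) / (n - 1)
    # Nonlinear update mixing the agent's own state with the communication.
    return np.tanh(hidden @ W_h + comm @ W_c)
```

Stacking a few such steps (as CommNet does) lets information about every agent's charging decision propagate to all others, which is what allows the coordinated policy to respect a shared transformer-load limit without a fixed network topology.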

Keywords: coordinated charging; multiple EVs; electric vehicles; deep reinforcement learning; DRL-based algorithm

Journal Title: IEEE Transactions on Intelligent Transportation Systems
Year Published: 2022
