Agent-Based Energy Sharing Mechanism Using Deep Deterministic Policy Gradient Algorithm

Balancing energy generation and consumption is essential for the smooth operation of power grids. A mismatch between energy supply and demand not only increases costs on both sides but also significantly affects the stability of the system. This paper proposes a novel energy sharing mechanism (ESM) to facilitate the consumption of locally generated energy. With the help of the ESM, prosumers have an opportunity to share surplus energy with neighboring prosumers. The problem is formulated as a leader–follower framework based on Stackelberg game theory. To solve this problem, a deep deterministic policy gradient (DDPG) algorithm is applied to find the Nash equilibrium (NE). The numerical results demonstrate that the proposed method is more stable than conventional reinforcement learning (RL) algorithms. Moreover, the proposed method converges to the NE and finds a relatively good energy sharing (ES) pricing strategy without requiring specific system information. In short, the proposed ESM can be seen as a win–win strategy for both prosumers and the power system.
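
DDPG is an actor-critic method for continuous action spaces, which suits the continuous ES pricing decision described above. The sketch below is a minimal illustration of that idea, not the authors' implementation: the state variables, reward signal, network sizes, and hyperparameters are assumptions chosen only to show how a DDPG agent could learn a continuous pricing action for the leader in a leader–follower setting.

# Minimal DDPG sketch for learning an energy sharing price (illustrative only;
# the environment, reward, and dimensions are assumptions, not from the paper).
import random
from collections import deque

import torch
import torch.nn as nn

class Actor(nn.Module):
    """Maps the leader's observed state (e.g., surplus, demand) to a price in [0, 1]."""
    def __init__(self, state_dim, action_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, action_dim), nn.Sigmoid())

    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    """Estimates Q(s, a) for a state and a pricing action."""
    def __init__(self, state_dim, action_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, 64), nn.ReLU(),
            nn.Linear(64, 1))

    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

class DDPGAgent:
    def __init__(self, state_dim=3, action_dim=1, gamma=0.99, tau=0.005):
        self.actor, self.actor_t = Actor(state_dim, action_dim), Actor(state_dim, action_dim)
        self.critic, self.critic_t = Critic(state_dim, action_dim), Critic(state_dim, action_dim)
        self.actor_t.load_state_dict(self.actor.state_dict())
        self.critic_t.load_state_dict(self.critic.state_dict())
        self.a_opt = torch.optim.Adam(self.actor.parameters(), lr=1e-3)
        self.c_opt = torch.optim.Adam(self.critic.parameters(), lr=1e-3)
        self.buffer = deque(maxlen=100_000)
        self.gamma, self.tau = gamma, tau

    def act(self, state, noise_std=0.1):
        # Deterministic policy plus exploration noise, clipped to a valid price.
        with torch.no_grad():
            a = self.actor(torch.as_tensor(state, dtype=torch.float32))
        return (a + noise_std * torch.randn_like(a)).clamp(0.0, 1.0).numpy()

    def store(self, s, a, r, s2):
        self.buffer.append((list(s), list(a), float(r), list(s2)))

    def update(self, batch_size=64):
        if len(self.buffer) < batch_size:
            return
        batch = random.sample(self.buffer, batch_size)
        s, a, r, s2 = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
        r = r.unsqueeze(-1)
        # Critic: regress toward the bootstrapped Bellman target from the target networks.
        with torch.no_grad():
            target_q = r + self.gamma * self.critic_t(s2, self.actor_t(s2))
        critic_loss = nn.functional.mse_loss(self.critic(s, a), target_q)
        self.c_opt.zero_grad(); critic_loss.backward(); self.c_opt.step()
        # Actor: ascend the critic's value estimate of the current pricing policy.
        actor_loss = -self.critic(s, self.actor(s)).mean()
        self.a_opt.zero_grad(); actor_loss.backward(); self.a_opt.step()
        # Soft-update the target networks toward the online networks.
        for net, net_t in ((self.actor, self.actor_t), (self.critic, self.critic_t)):
            for p, p_t in zip(net.parameters(), net_t.parameters()):
                p_t.data.mul_(1 - self.tau).add_(self.tau * p.data)

In the Stackelberg setting described in the abstract, the followers' best responses to the announced price would determine the reward used in store() and update(); that environment is left abstract here.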

Keywords: deep deterministic; sharing mechanism; deterministic policy; energy sharing; policy gradient; energy

Journal Title: Energies
Year Published: 2020
