
Dynamic energy dispatch strategy for integrated energy system based on improved deep reinforcement learning


Abstract: Dynamic energy dispatch is an integral part of the operation optimization of integrated energy systems (IESs). Most existing dynamic dispatch schemes depend heavily on explicit forecasts or mathematical models of future uncertainties; because renewable generation and energy demands are stochastic, these approaches are limited by the accuracy of the forecast or model. To address this, a model-free dynamic dispatch strategy for IESs based on improved deep reinforcement learning (DRL) is proposed. The IES dynamic dispatch problem is formulated as a Markov decision process (MDP) in which the uncertainties of renewable generation, electric load, and heat load are considered. To solve the MDP, an improved deep deterministic policy gradient (DDPG) algorithm is developed that uses a prioritized experience replay mechanism and L2 regularization to improve the policy quality and learning efficiency of the dispatch strategy. The proposed approach requires no forecast information or distribution knowledge and can adaptively respond to stochastic fluctuations in supply and demand. Simulation results show that the proposed dispatch strategy converges faster and achieves lower operating costs than the original DDPG-based strategy, and its advantages in cost-effectiveness and adaptation to stochastic environments are validated.
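The abstract describes an improved DDPG that combines prioritized experience replay with L2 regularization. The sketch below shows what such an update step could look like in PyTorch; it is a minimal illustration, not the authors' implementation, and the network sizes, hyperparameters, replay-buffer details, and the IES state/action dimensions are assumptions made here for clarity.

```python
# Hedged sketch: one DDPG update with proportional prioritized experience
# replay and L2 regularization (via Adam's weight_decay). Exploration noise,
# action bounds, and the IES environment itself are omitted for brevity.
import numpy as np
import torch
import torch.nn as nn

class PrioritizedReplay:
    def __init__(self, capacity, alpha=0.6):
        self.capacity, self.alpha = capacity, alpha
        self.data, self.prios, self.pos = [], np.zeros(capacity), 0

    def push(self, transition):                       # transition = (s, a, r, s2)
        max_p = self.prios.max() if self.data else 1.0
        if len(self.data) < self.capacity:
            self.data.append(transition)
        else:
            self.data[self.pos] = transition
        self.prios[self.pos] = max_p                  # new samples get max priority
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, beta=0.4):
        p = self.prios[:len(self.data)] ** self.alpha
        p /= p.sum()
        idx = np.random.choice(len(self.data), batch_size, p=p)
        weights = (len(self.data) * p[idx]) ** (-beta)   # importance-sampling weights
        weights /= weights.max()
        batch = [self.data[i] for i in idx]
        return batch, idx, torch.as_tensor(weights, dtype=torch.float32)

    def update_priorities(self, idx, td_errors):
        self.prios[idx] = np.abs(td_errors) + 1e-6

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                         nn.Linear(128, 128), nn.ReLU(),
                         nn.Linear(128, out_dim))

state_dim, action_dim, gamma = 6, 3, 0.99             # assumed IES dimensions
actor, critic = mlp(state_dim, action_dim), mlp(state_dim + action_dim, 1)
actor_t, critic_t = mlp(state_dim, action_dim), mlp(state_dim + action_dim, 1)
actor_t.load_state_dict(actor.state_dict())
critic_t.load_state_dict(critic.state_dict())

# weight_decay implements the L2 penalty on network parameters
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4, weight_decay=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3, weight_decay=1e-4)
buffer = PrioritizedReplay(100_000)

def update(batch_size=64):
    batch, idx, w = buffer.sample(batch_size)
    s, a, r, s2 = map(lambda x: torch.as_tensor(np.array(x), dtype=torch.float32),
                      zip(*batch))
    with torch.no_grad():                              # target Q-value
        q_target = r.unsqueeze(1) + gamma * critic_t(torch.cat([s2, actor_t(s2)], 1))
    q = critic(torch.cat([s, a], 1))
    td = q_target - q
    critic_loss = (w.unsqueeze(1) * td.pow(2)).mean()  # importance-weighted TD loss
    critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()
    actor_loss = -critic(torch.cat([s, actor(s)], 1)).mean()
    actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()
    buffer.update_priorities(idx, td.detach().squeeze(1).numpy())
    for tgt, src in ((actor_t, actor), (critic_t, critic)):   # soft target update
        for pt, p in zip(tgt.parameters(), src.parameters()):
            pt.data.mul_(0.995).add_(0.005 * p.data)
```

In this sketch, transitions with larger TD errors are replayed more often through the proportional priorities, while the importance-sampling weights correct the resulting bias in the critic loss; the L2 term enters only through the optimizers' weight_decay.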

Keywords: dynamic energy dispatch; integrated energy system; improved deep reinforcement learning; dispatch strategy

Journal Title: Energy
Year Published: 2021
