Article

Microgrid energy management using deep Q-network reinforcement learning

Journal

ALEXANDRIA ENGINEERING JOURNAL
Volume 61, Issue 11, Pages 9069-9078

Publisher

ELSEVIER
DOI: 10.1016/j.aej.2022.02.042

Keywords

Deep reinforcement learning; Deep Q-networks; Energy management; Microgrid

Funding

  1. King Fahd University of Petroleum & Minerals through IRC-REPS [INRE2103]
  2. KACARE Energy Research and Innovative Center (ERIC), KFUPM


Abstract
This paper proposes a deep reinforcement learning-based approach to optimally manage the different energy resources within a microgrid. The proposed methodology considers the stochastic behavior of the main elements, which include the load profile, generation profile, and pricing signals. The energy management problem is formulated as a finite-horizon Markov Decision Process (MDP) by defining the state, action, reward, and objective functions, without prior knowledge of the transition probabilities. Such a formulation does not require an explicit model of the microgrid; instead, it uses accumulated data and interaction with the microgrid to derive the optimal policy. An efficient reinforcement learning algorithm based on deep Q-networks is implemented to solve the developed formulation. To confirm the effectiveness of the methodology, a case study based on a real microgrid is implemented. The results demonstrate its capability to obtain online scheduling of various energy resources within a microgrid with optimal cost-effective actions under stochastic conditions. The achieved operating costs are within 2% of those of the optimal schedule.

(c) 2022 THE AUTHORS. Published by Elsevier B.V. on behalf of the Faculty of Engineering, Alexandria University. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
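The formulation described in the abstract — a finite-horizon MDP solved by Q-learning with epsilon-greedy exploration — can be illustrated with a toy sketch. This is not the authors' implementation: the battery model, price signal, and hyperparameters below are invented for illustration, and a linear one-hot Q-approximator stands in for the deep network so the example stays dependency-free.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-horizon microgrid (hypothetical numbers, not the paper's case study):
# state = (hour, battery level), actions = discharge / idle / charge one unit.
T = 24                                         # horizon in hours
LEVELS = 5                                     # discrete battery states of charge
ACTIONS = [-1, 0, 1]                           # energy moved into the battery
price = 0.1 + 0.05 * np.sin(2 * np.pi * np.arange(T) / T)  # assumed price signal

def step(t, soc, a):
    """Pay the hourly price to charge; earn it when discharging."""
    soc2 = int(np.clip(soc + a, 0, LEVELS - 1))
    reward = -price[t] * (soc2 - soc)          # cost of energy actually moved
    return soc2, reward

def features(t, soc):
    """One-hot (hour, level) features, so the linear model is effectively tabular."""
    x = np.zeros(T * LEVELS)
    x[t * LEVELS + soc] = 1.0
    return x

# Q(s, a) ~= w[a] . x(s); epsilon-greedy exploration, TD(0) targets as in Q-learning.
w = np.zeros((len(ACTIONS), T * LEVELS))
eps, alpha, gamma = 0.2, 0.5, 1.0

for episode in range(2000):
    soc = 2
    for t in range(T):
        x = features(t, soc)
        q = w @ x
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(q))
        soc2, r = step(t, soc, ACTIONS[a])
        # Finite-horizon Bellman target: terminal at t == T-1, bootstrap otherwise.
        target = r if t == T - 1 else r + gamma * np.max(w @ features(t + 1, soc2))
        w[a] += alpha * (target - q[a]) * x
        soc = soc2

# Greedy rollout with the learned Q-function: the agent should sell stored energy
# at high-price hours and buy at low-price hours.
soc, total_greedy = 2, 0.0
for t in range(T):
    a = int(np.argmax(w @ features(t, soc)))
    soc, r = step(t, soc, ACTIONS[a])
    total_greedy += r
print(f"greedy-policy return: {total_greedy:.3f}")
```

A full deep Q-network would replace the one-hot `features`/`w` pair with a neural network and add experience replay and a target network; the finite-horizon Bellman target and epsilon-greedy action selection are the same idea the abstract refers to.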


