Article

Demand Response in HEMSs Using DRL and the Impact of Its Various Configurations and Environmental Changes

Journal

ENERGIES
Volume 15, Issue 21

Publisher

MDPI
DOI: 10.3390/en15218235

Keywords

deep learning; reinforcement learning; deep Q-networks; home energy management system; demand response

Funding

  1. Qatar National Research Fund (a member of the Qatar Foundation) [NPRP11S-1202-170052]


This paper examines the performance of deep reinforcement learning (DRL) in addressing demand response (DR) in home energy management systems (HEMSs). By studying the effects of various DRL configurations on HEMS performance, it finds that a properly configured DRL agent can successfully schedule a significant portion of household appliances and reduce electricity costs across different simulation scenarios.
With smart grid advances, enormous amounts of data are made available, enabling the training of machine learning algorithms such as deep reinforcement learning (DRL). Recent research has utilized DRL to obtain optimal solutions for complex real-time optimization problems, including demand response (DR), where traditional methods fail to meet timing and complexity requirements. Although DRL has shown good performance for particular use cases, most studies do not report the impacts of various DRL settings. This paper studies DRL performance when addressing DR in home energy management systems (HEMSs). The trade-offs of various DRL configurations and how they influence HEMS performance are investigated. The main elements that affect DRL model training are identified, including state-action pairs, the reward function, and hyperparameters. Various representations of these elements are analyzed to characterize their impact. In addition, different environmental changes and scenarios are considered to analyze the model's scalability and adaptability. The findings elucidate the adequacy of DRL to address HEMS challenges: when appropriately configured, it successfully schedules 73% to 98% of the appliances in different simulation scenarios and reduces electricity costs by 19% to 47%.
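The abstract identifies three elements that drive DRL training for HEMS demand response: state-action pairs, the reward function, and hyperparameters. The toy sketch below is a hedged illustration of that structure, not the paper's implementation: it uses simple tabular epsilon-greedy value learning (rather than a deep Q-network) to schedule one shiftable appliance under a time-of-use tariff. The tariff, appliance load, and hyperparameter values (`alpha`, `epsilon`, episode count) are all hypothetical.

```python
import random

# Hypothetical time-of-use tariff ($/kWh) over 24 hourly slots,
# and an assumed consumption for one shiftable appliance.
PRICES = [0.10] * 7 + [0.25] * 10 + [0.40] * 4 + [0.10] * 3
LOAD_KWH = 1.5

def reward(hour):
    # Reward function: negative electricity cost of running at this hour,
    # so cheaper slots yield higher (less negative) rewards.
    return -PRICES[hour] * LOAD_KWH

def train(episodes=5000, alpha=0.1, epsilon=0.2, seed=0):
    # alpha (learning rate) and epsilon (exploration rate) are the
    # hyperparameters; the 24 start hours form the action space.
    rng = random.Random(seed)
    q = [0.0] * 24  # single-state problem: one Q-value per action
    for _ in range(episodes):
        # Epsilon-greedy action selection over the 24 start hours.
        if rng.random() < epsilon:
            a = rng.randrange(24)
        else:
            a = max(range(24), key=lambda h: q[h])
        # One-step (bandit-style) value update toward the observed reward.
        q[a] += alpha * (reward(a) - q[a])
    return q

q = train()
best_hour = max(range(24), key=lambda h: q[h])
```

After training, `best_hour` falls in one of the low-price slots, i.e. the agent learns to shift the appliance to off-peak hours; the full paper's DQN setting replaces the lookup table with a neural network over richer state features.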

