Article

Improving Non-Intrusive Load Disaggregation through an Attention-Based Deep Neural Network

Journal

ENERGIES
Volume 14, Issue 4, Article 847

Publisher

MDPI
DOI: 10.3390/en14040847

Keywords

attention mechanism; deep neural network; energy disaggregation; non-intrusive load monitoring

Funding

  1. ENEA within the PAR2018 project
  2. Department of Civil and Computer Engineering of the University of Rome Tor Vergata

Abstract

The paper proposes a deep neural network for the NILM problem that outperforms existing techniques in the reported experiments. By incorporating a tailored attention mechanism, the network is able to correctly detect the states of appliances and to locate signal sections with high power consumption.
Energy disaggregation, known in the literature as Non-Intrusive Load Monitoring (NILM), is the task of inferring the power demand of individual appliances given the aggregate power demand recorded by a single smart meter that monitors multiple appliances. In this paper, we propose a deep neural network that combines a regression subnetwork with a classification subnetwork for solving the NILM problem. Specifically, we improve the generalization capability of the overall architecture by including an encoder-decoder with a tailored attention mechanism in the regression subnetwork. The attention mechanism is inspired by the temporal attention that has been successfully applied in neural machine translation, text summarization, and speech recognition. The experiments conducted on two publicly available datasets, REDD and UK-DALE, show that our proposed deep neural network outperforms the state of the art in all the considered experimental conditions. We also show that modeling attention translates into the network's ability to correctly detect the turning on or off of an appliance and to locate signal sections with high power consumption, which are of extreme interest in the field of energy disaggregation.
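The abstract describes the architecture only at a high level. Below is a minimal sketch, assuming a PyTorch implementation, of how a regression subnetwork with temporal attention can be paired with a classification subnetwork for NILM; the layer sizes, window length, class name AttentionNILM, and the gating of the regression output by the predicted on/off probability are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of a regression + classification NILM network with
# temporal attention. All hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionNILM(nn.Module):
    def __init__(self, hidden_dim=64):
        super().__init__()
        # Shared convolutional encoder over the aggregate-power window
        self.encoder = nn.Sequential(
            nn.Conv1d(1, hidden_dim, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden_dim, hidden_dim, kernel_size=5, padding=2), nn.ReLU(),
        )
        # Temporal attention: score each encoded time step, softmax over time
        self.attn_score = nn.Linear(hidden_dim, 1)
        # Regression head: per-sample appliance power estimate
        self.regressor = nn.Linear(hidden_dim, 1)
        # Classification head: per-window on/off probability from the attention context
        self.classifier = nn.Linear(hidden_dim, 1)

    def forward(self, x):                              # x: (batch, window)
        h = self.encoder(x.unsqueeze(1))               # (batch, hidden, T)
        h = h.transpose(1, 2)                          # (batch, T, hidden)
        alpha = F.softmax(self.attn_score(h), dim=1)   # attention weights over time
        context = (alpha * h).sum(dim=1)               # (batch, hidden) weighted summary
        power = self.regressor(h).squeeze(-1)          # (batch, T) per-step power
        state = torch.sigmoid(self.classifier(context))  # (batch, 1) on/off probability
        # Gating the regression output with the predicted state is a common choice
        # in regression + classification NILM models (an assumption here, not
        # necessarily the paper's exact fusion rule).
        return power * state, state

# Example usage on a dummy batch of aggregate windows
model = AttentionNILM()
aggregate = torch.randn(8, 480)          # 8 windows of 480 samples each
appliance_power, on_off = model(aggregate)
```

The attention weights alpha make the per-window summary concentrate on the time steps that drive the prediction, which is consistent with the paper's observation that attention helps localize switching events and high-consumption sections.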
