Article

Nonintrusive Load Monitoring Based on Self-Supervised Learning

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIM.2023.3246504

Keywords

Task analysis; Aggregates; Training; Neural networks; Load monitoring; Indexes; Convolutional neural networks; Deep neural network (DNN); nonintrusive load monitoring (NILM); self-supervised learning (SSL); sequence-to-point learning

Abstract

This article proposes a self-supervised learning approach that allows deep learning models for nonintrusive load monitoring to be trained with limited labeled data, addressing the challenge of generalizing trained models to sites with different load characteristics and appliance operating patterns.

Deep learning models for nonintrusive load monitoring (NILM) tend to require a large amount of labeled data for training. However, it is difficult to generalize trained models to unseen sites because load characteristics and appliance operating patterns differ between datasets. To address this problem, self-supervised learning (SSL) is proposed in this article, in which labeled appliance-level data from the target dataset or house are not required. First, only the aggregate power readings from the target dataset are needed to pretrain a general network via a self-supervised pretext task that maps aggregate power sequences to derived representatives. Then, supervised downstream tasks are carried out for each appliance category to fine-tune the pretrained network, transferring the features learned in the pretext task. Utilizing labeled source datasets enables the downstream tasks to learn how each load is disaggregated by mapping the aggregate to labels. Finally, the fine-tuned network is applied to load disaggregation at the target sites. For validation, multiple experimental cases are designed based on three publicly available datasets: REDD, U.K.-DALE, and REFIT. In addition, state-of-the-art neural networks are employed to perform the NILM task in the experiments. Across these cases, SSL generally outperforms zero-shot learning in load disaggregation performance, without requiring any submetering data from the target datasets.
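The pretrain-then-fine-tune flow described above can be sketched as follows. Everything here is an illustrative assumption rather than the paper's actual design: the window length, the function names (`make_windows`, `derive_pretext_targets`, `midpoint_targets`), and the choice of the per-window mean as a stand-in for the "derived representative" pretext label.

```python
import numpy as np

# Hypothetical sketch of the two-stage SSL pipeline for seq2point NILM.
# The window mean stands in for whatever "derived representative" the
# authors actually compute; all names here are illustrative.

WINDOW = 99  # seq2point input length (assumed, not from the paper)

def make_windows(aggregate, window=WINDOW):
    """Slide a window over the aggregate mains signal; seq2point maps
    each window to a single output point at its midpoint."""
    n = len(aggregate) - window + 1
    return np.stack([aggregate[i:i + window] for i in range(n)])

def derive_pretext_targets(windows):
    """Stage 1 (self-supervised): pseudo-labels derived from the
    unlabeled aggregate itself (here, simply the per-window mean)."""
    return windows.mean(axis=1)

def midpoint_targets(appliance, window=WINDOW):
    """Stage 2 (supervised downstream): appliance power at each window
    midpoint, taken from a labeled source dataset."""
    half = window // 2
    return appliance[half:len(appliance) - half]

# --- usage on synthetic data ---
rng = np.random.default_rng(0)
aggregate = rng.random(500)   # unlabeled target-site mains readings
appliance = rng.random(500)   # submetered readings from a labeled source site

X = make_windows(aggregate)
pretext_y = derive_pretext_targets(X)       # pretrain the network on these
downstream_y = midpoint_targets(appliance)  # then fine-tune per appliance

assert X.shape == (500 - WINDOW + 1, WINDOW)
assert pretext_y.shape[0] == X.shape[0]
assert downstream_y.shape[0] == X.shape[0]
```

The point of the sketch is the data flow: the pretext targets come from the aggregate alone (no submetering at the target site), while the downstream targets require appliance-level labels and therefore come from a different, labeled source dataset.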

