Article

Fault detection diagnostic for HVAC systems via deep learning algorithms

Journal

ENERGY AND BUILDINGS
Volume 250 (2021)

Publisher

ELSEVIER SCIENCE SA
DOI: 10.1016/j.enbuild.2021.111275

Keywords

Fault detection diagnosis; machine learning; deep recurrent neural network; hyperparameter optimization


Deep learning algorithms, particularly deep recurrent neural networks (DRNNs), have gained attention for fault detection diagnostics in HVAC systems because of their high detection accuracy. Open challenges include identifying a bespoke DRNN configuration and optimizing its hyperparameters; the study introduces, tunes, and compares several configurations to address both. The final DRNN model outperforms other advanced data-driven techniques, such as random forest and gradient boosting, demonstrating its effectiveness for fault detection in HVAC systems.
Because of their high detection accuracy, deep learning algorithms have recently become the focus of increased attention for fault detection diagnostic (FDD) analysis of heating, ventilation, and air conditioning (HVAC) systems. Among the machine learning algorithms in the field, deep recurrent neural networks (DRNNs) are widely used because they can learn the complex, uncertain, and temporally dependent nature of faults. However, embedding DRNNs in FDD applications is still subject to two challenges: (I) a bespoke DRNN configuration, out of conceivably infinite DRNN architectures, has not been explored for the task of FDD in HVAC systems; (II) hyperparameter optimization, a computationally expensive task due to its empirical nature, has not been investigated. In this respect, seven DRNN configurations are introduced and tuned that can automatically detect faults of different degrees under both faulty and normal conditions. Then, a comprehensive study of hyperparameters is conducted to optimize and compare all the proposed configurations based on their accuracy and training computational time. By searching over the number of hidden layers and layer sizes, optimization methods, model regularization, and batching, the final DRNN model is selected from more than 200 experiments. All the training configuration files are publicly available. In addition, the proposed DRNN model is compared against two other advanced data-driven techniques, namely random forest (RF) and gradient boosting (GB). The final DRNN model outperforms RF and GB regression by a significant margin. (c) 2021 Elsevier B.V. All rights reserved.
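
For orientation, the sketch below shows one way such a pipeline could look: a small LSTM-based fault classifier trained over a toy hyperparameter grid, followed by random forest and gradient boosting baselines on flattened windows. It is a minimal illustration only; the choice of PyTorch and scikit-learn, the model class FaultDRNN, the synthetic data shapes, and the grid values are assumptions for the example and do not reproduce the authors' architectures, data, or 200-experiment search.

# Hypothetical sketch: DRNN-style fault classifier for HVAC sensor sequences,
# a small hyperparameter sweep, and RF/GB baselines. Illustrative only.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

class FaultDRNN(nn.Module):
    """Stacked LSTM followed by a linear head over the last hidden state."""
    def __init__(self, n_features, hidden_size, n_layers, n_classes, dropout=0.0):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden_size, num_layers=n_layers,
                           batch_first=True,
                           dropout=dropout if n_layers > 1 else 0.0)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):                  # x: (batch, time, features)
        out, _ = self.rnn(x)
        return self.head(out[:, -1, :])    # class logits from the last time step

def train_one_config(X, y, hidden_size, n_layers, lr, epochs=10, batch_size=64):
    """Train one DRNN configuration and return its training accuracy."""
    model = FaultDRNN(X.shape[-1], hidden_size, n_layers, n_classes=int(y.max()) + 1)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    ds = torch.utils.data.TensorDataset(torch.tensor(X, dtype=torch.float32),
                                        torch.tensor(y, dtype=torch.long))
    loader = torch.utils.data.DataLoader(ds, batch_size=batch_size, shuffle=True)
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()
    with torch.no_grad():
        preds = model(torch.tensor(X, dtype=torch.float32)).argmax(dim=1)
    return (preds.numpy() == y).mean()

if __name__ == "__main__":
    # Synthetic stand-in data: 500 windows of 48 time steps x 8 sensors, 4 fault classes.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 48, 8)).astype(np.float32)
    y = rng.integers(0, 4, size=500)

    # Illustrative grid over layer size, depth, and learning rate (not the paper's search space).
    grid = [(h, l, lr) for h in (32, 64) for l in (1, 2) for lr in (1e-3, 1e-4)]
    results = {cfg: train_one_config(X, y, *cfg) for cfg in grid}
    best = max(results, key=results.get)
    print("best DRNN config (hidden, layers, lr):", best, "train acc:", results[best])

    # Baselines on flattened windows, as a rough analogue of the RF/GB comparison.
    X_flat = X.reshape(len(X), -1)
    for name, clf in [("RF", RandomForestClassifier(n_estimators=100, random_state=0)),
                      ("GB", GradientBoostingClassifier(random_state=0))]:
        print(name, "train acc:", clf.fit(X_flat, y).score(X_flat, y))

In a real study, model selection would of course rely on held-out validation data rather than the training accuracy used in this toy sweep.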


