Article

Machine learning algorithms to predict flow boiling pressure drop in mini/micro-channels based on universal consolidated data

Journal

International Journal of Heat and Mass Transfer

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.ijheatmasstransfer.2021.121607

Keywords

Machine learning; Neural networks; ANN; XGBoost; KNN; LightGBM; Flow boiling; Pressure drop

Funding

  1. Office of Naval Research (ONR) [N00014-21-1-2078]

This study developed machine learning models to predict pressure drop in two-phase flow in mini/micro-channels, achieving MAEs of 9.58%, 10.38%, 13.52%, and 14.49% for the optimized ANN, XGBoost, KNN, and LightGBM models, respectively. The optimized machine-learning models outperformed traditional pressure drop correlations and showed good performance across different datasets and flow regimes.
Two-phase flow in mini/micro-channels can meet the high heat dissipation requirements of many state-of-the-art cooling solutions. However, there is a lack of accurate universal methods for predicting parameters such as pressure drop in these configurations. Conventional approaches to predicting pressure drop employ either the Homogeneous Equilibrium Model (HEM) or semi-empirical correlations. The current study leverages the availability of data collected over the past few decades to build several machine learning models and to demonstrate the efficacy and ease of building and deploying such models. A consolidated database of 2787 data points for flow boiling pressure drop in mini/micro-channels is amassed from 21 sources and includes 10 working fluids, reduced pressures of 0.0006-0.7766, hydraulic diameters of 0.15-5.35 mm, mass velocities of 33.1 < G < 2738 kg/m²·s, liquid-only Reynolds numbers of 14-27,658, superficial vapor Reynolds numbers of 75.58-199,453, and flow qualities from 0 to 1. This consolidated database is utilized to develop four machine-learning-based regression models, viz., Artificial Neural Networks (ANN), KNN regression, Extreme Gradient Boosting (XGBoost), and LightGBM. Both input parameters and hyperparameters are optimized for the individual models. With the dimensionless input parameters Bd, Bo, Fr_f, Fr_fo, Fr_g, Fr_go, Fr_tp, Pr_f, Pr_g, Pe_g, Pe_f, Re_f, Re_fo, Re_g, Re_go, Re_eq, Su_f, Su_g, We_f, We_fo, We_g, We_go, and We_tp, the ANN, XGBoost, KNN, and LightGBM models predict the test data with MAEs of 9.58%, 10.38%, 13.52%, and 14.49%, respectively. The optimized machine-learning models performed better than highly reliable generalized pressure drop correlations and showed good performance across individual datasets, flow regimes, and channel configurations. (C) 2021 Elsevier Ltd. All rights reserved.
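To make the described workflow concrete, the sketch below shows how a gradient-boosted regression model of this kind could be trained on the dimensionless inputs listed above and scored by percentage MAE on held-out data. It is a minimal illustration, not the authors' exact pipeline: the file name, column names (e.g. "dP_exp"), train/test split, and hyperparameter values are assumptions introduced here for demonstration.

```python
# Minimal sketch of one of the four model types (XGBoost regression) trained on
# dimensionless inputs to predict flow boiling pressure drop.
# NOTE: the CSV file name, column names, and hyperparameters are hypothetical
# placeholders, not taken from the paper.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error
from xgboost import XGBRegressor

# Hypothetical consolidated database with one row per data point.
df = pd.read_csv("flow_boiling_database.csv")
features = ["Bd", "Bo", "Fr_f", "Fr_fo", "Fr_g", "Fr_go", "Fr_tp",
            "Pr_f", "Pr_g", "Pe_g", "Pe_f",
            "Re_f", "Re_fo", "Re_g", "Re_go", "Re_eq",
            "Su_f", "Su_g", "We_f", "We_fo", "We_g", "We_go", "We_tp"]
X, y = df[features], df["dP_exp"]  # assumed name for measured pressure drop

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Illustrative hyperparameters; the study tunes these separately per model.
model = XGBRegressor(n_estimators=500, max_depth=6, learning_rate=0.05)
model.fit(X_train, y_train)

# Report mean absolute error as a percentage of the measured values.
mae_pct = 100 * mean_absolute_percentage_error(y_test, model.predict(X_test))
print(f"Test MAE: {mae_pct:.2f}%")
```

The same split and percentage-MAE evaluation would apply unchanged to the other model families (ANN, KNN regression, LightGBM); only the estimator and its tuned hyperparameters differ.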

