Article

On the accuracy of void fraction measurements by single-beam gamma-densitometry for gas-liquid two-phase flows in pipes

Journal

EXPERIMENTAL THERMAL AND FLUID SCIENCE
Volume 28, Issue 6, Pages 533-544

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.expthermflusci.2003.08.003


Gamma-densitometry is a widely used method for non-intrusive measurement of the local void fraction of gas-liquid two-phase pipe flows. With standard single-beam gamma-densitometry, an accurate correlation between the loss of radiation intensity in a test volume and its void fraction requires knowledge of the prevailing flow regime. These regimes are mathematically difficult to describe because of their complex structure. The flow regime dependence is therefore usually ignored, and the loss of intensity is related to the void fraction by universal linear or logarithmic approximations. This work addresses the accuracy of these approximations, since the achievable accuracy of single-beam gamma-densitometry is highly important in practical applications. The deviations of both approximations are calculated numerically for modeled flow patterns. A dimensionless relation is deduced, which gives their accuracy as a function of the pipe radius and the absorption coefficient of the liquid phase. Experiments with water-air flows in a 21 mm diameter pipe and an iodine-125 gamma source confirm the numerical calculations. A method is outlined to deduce the achievable accuracy in practical use directly from the mass flows, the radiation energy and the pipe radius. (C) 2003 Elsevier Inc. All rights reserved.
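As an illustration of the two approximations discussed in the abstract, the sketch below shows the forms commonly used in gamma-densitometry, assuming Beer-Lambert attenuation and single-beam calibration readings for the all-liquid and all-gas pipe. The function and variable names are ours, not the paper's, and the calibration values are hypothetical.

```python
import numpy as np

def void_fraction_linear(I, I_liquid, I_gas):
    """Linear approximation: interpolate the measured intensity I between
    the all-liquid (I_liquid) and all-gas (I_gas) calibration readings."""
    return (I - I_liquid) / (I_gas - I_liquid)

def void_fraction_log(I, I_liquid, I_gas):
    """Logarithmic approximation: follows from the Beer-Lambert law
    I = I0 * exp(-mu * x) when the beam-averaged liquid thickness is
    assumed to scale linearly with the void fraction."""
    return np.log(I / I_liquid) / np.log(I_gas / I_liquid)

# Hypothetical calibration and measurement values (counts per second):
I_liquid, I_gas = 1200.0, 3000.0   # pipe full of water / full of air
I_measured = 2100.0                # two-phase reading
print(void_fraction_linear(I_measured, I_liquid, I_gas))  # ~0.50
print(void_fraction_log(I_measured, I_liquid, I_gas))     # ~0.61
```

The spread between the two estimates for the same reading gives a feel for the regime-dependent deviation that the paper quantifies as a function of pipe radius and liquid-phase absorption coefficient.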
