Article

3DInvNet: A Deep Learning-Based 3D Ground-Penetrating Radar Data Inversion

Journal

IEEE Transactions on Geoscience and Remote Sensing

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TGRS.2023.3275306

Keywords

Three-dimensional displays; Permittivity; Image reconstruction; Noise reduction; Feature extraction; Deep learning; Reflection; denoising; ground-penetrating radar (GPR); permittivity reconstruction

Abstract
The reconstruction of the 3D permittivity map from ground-penetrating radar (GPR) data is of great importance for mapping subsurface environments and inspecting underground structural integrity. Traditional iterative 3D reconstruction algorithms suffer from strong nonlinearity, ill-posedness, and high computational costs. To tackle these issues, a 3D deep learning scheme, called 3DInvNet, is proposed to reconstruct 3D permittivity maps from GPR C-scans. The proposed scheme leverages a prior 3D convolutional neural network (CNN) with a feature attention mechanism to suppress the noise in the C-scans due to subsurface heterogeneous soil environments. Then a 3D U-shaped encoder-decoder network with multiscale feature aggregation (MSFA) modules is designed to establish the optimal inverse mapping from the denoised C-scans to 3D permittivity maps. Furthermore, a three-step separate learning strategy is employed to pretrain and fine-tune the networks. The proposed scheme is applied to numerical simulation as well as real measurement data. The quantitative and qualitative results show the networks' capability, generalizability, and robustness in denoising GPR C-scans and reconstructing 3D permittivity maps of subsurface objects.
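The abstract mentions a feature attention mechanism in the denoising CNN but does not specify its design. As a rough illustration of channel attention on a 3D feature volume, the following NumPy sketch implements a squeeze-and-excitation-style gate; the function name `channel_attention_3d` and the weights `w1`/`w2` are hypothetical and not taken from the paper.

```python
import numpy as np

def channel_attention_3d(feat, w1, w2):
    """Squeeze-and-excitation-style channel attention on a 3D feature
    volume of shape (C, D, H, W). A hypothetical stand-in for the
    paper's (unspecified) feature attention mechanism."""
    c = feat.shape[0]
    # Squeeze: global average pool over the three spatial dimensions
    z = feat.reshape(c, -1).mean(axis=1)             # (C,)
    # Excitation: bottleneck FC layers with ReLU then sigmoid
    h = np.maximum(w1 @ z, 0.0)                      # (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))              # (C,), values in (0, 1)
    # Reweight each channel of the feature volume by its gate value
    return feat * s[:, None, None, None]

# Toy example: 8 channels, reduction ratio r = 2
rng = np.random.default_rng(0)
C, r = 8, 2
feat = rng.standard_normal((C, 4, 4, 4))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
out = channel_attention_3d(feat, w1, w2)
print(out.shape)  # (8, 4, 4, 4)
```

Because the sigmoid gate lies in (0, 1), the module can only attenuate channels, which is one simple way such an attention block can suppress noise-dominated feature maps.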

