4.6 Article

Robust Pixel-Level Crack Detection Using Deep Fully Convolutional Neural Networks

Journal

Journal of Computing in Civil Engineering

Publisher

ASCE (American Society of Civil Engineers)
DOI: 10.1061/(ASCE)CP.1943-5487.0000854

Keywords

Crack detection and segmentation; Fully convolutional neural network; Deep learning; Infrastructure; Pixel level; Automated inspection

Abstract

This paper introduces the use of deep fully convolutional neural networks for pixel-level defect detection in concrete infrastructure systems. Although coarse patch-level deep learning crack detection models abound in the literature and have shown promise, the coarse level of detail they provide, together with their requirement for fixed-size input images, significantly detracts from their applicability and usefulness for refined damage analysis. The deep fully convolutional model for crack detection introduced in this paper (CrackPix) leverages well-known image classification architectures for dense prediction by transforming their fully connected layers into convolutional filters. A transposed convolution layer is then used to upsample the resulting prediction heatmap to the size of the input image, thus providing pixel-level predictions. To develop and train the model, a concrete crack image data set was collected and carefully annotated at the pixel level. Sensitivity analysis showed that CrackPix correctly detected over 92% of crack pixels and 99.9% of noncrack pixels in the validation set. The model's performance was then compared against a state-of-the-art patchwise model, as well as traditional edge detection and adaptive thresholding alternatives, and its advantages were illustrated. The success of CrackPix, which enables the quantification of crack characteristics (e.g., width and length) in concrete structures, provides a key step toward automated inspection and quality assurance for infrastructure in future smart cities.
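
The abstract describes the core architectural idea: replace a classifier's fully connected layers with convolutional filters so the network accepts images of arbitrary size, then use a transposed convolution to upsample the coarse prediction heatmap back to the input resolution. The sketch below illustrates that pattern only; it is not the authors' CrackPix implementation, and the PyTorch framework, the TinyFCN name, and all layer sizes are assumptions chosen for brevity.

# Minimal sketch, assuming PyTorch; an illustration of the fully
# convolutional pattern described in the abstract, not the authors' code.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    """Encoder downsamples by 8x, a 1x1 convolution replaces the fully
    connected classifier head, and a transposed convolution upsamples the
    crack-score heatmap back to the input resolution."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),   # 1/2 resolution
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),   # 1/4 resolution
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),   # 1/8 resolution
        )
        # A fully connected layer re-expressed as a 1x1 convolution, so
        # inputs of any spatial size yield a spatial prediction heatmap.
        self.score = nn.Conv2d(128, 1, kernel_size=1)
        # Transposed convolution learns the 8x upsampling to pixel level.
        self.upsample = nn.ConvTranspose2d(1, 1, kernel_size=16, stride=8,
                                           padding=4)

    def forward(self, x):
        heatmap = self.score(self.encoder(x))
        return torch.sigmoid(self.upsample(heatmap))  # per-pixel crack probability

# Any input size divisible by 8 works; there is no fixed-size requirement.
image = torch.randn(1, 3, 256, 384)
mask = TinyFCN()(image)
print(mask.shape)  # torch.Size([1, 1, 256, 384])

Re-expressing the classifier head as a 1x1 convolution is what removes the fixed-size input restriction criticized in the abstract, and learning the upsampling with a transposed convolution (rather than fixed interpolation) is what allows dense, pixel-level crack masks from which width and length can be measured.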
