Article

A pavement distresses identification method optimized for YOLOv5s

Journal

SCIENTIFIC REPORTS
Volume 12, Issue 1, Pages -

Publisher

NATURE PORTFOLIO
DOI: 10.1038/s41598-022-07527-3

Funding

  1. Beijing Technology and Business University 2021 Graduate Research Ability Improvement Program Project of China

Automatic detection and recognition of pavement distresses is key to timely pavement repair: repairing distresses promptly prevents damage to the road structure and reduces traffic accidents. However, factors such as a single object category, shadows, and occlusion make pavement distress detection challenging. To address these problems, we use an improved YOLOv5 model to detect various pavement distresses. We optimize the YOLOv5 model and introduce an attention mechanism to enhance its robustness, making the improved model better suited for deployment on embedded devices. The optimized model is transplanted to a self-built intelligent mobile platform. Experimental results show that the improved network model proposed in this paper can effectively identify pavement distresses on the self-built intelligent mobile platform and datasets, achieving a precision of 95.5%, a recall of 94.3%, and an mAP of 95%. Compared with the YOLOv5s and YOLOv4 models, the mAP of the improved YOLOv5s model is higher by 4.3 and 25.8 percentage points, respectively. This method can provide a technical reference for pavement distress detection robots.
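The reported precision and recall follow the standard object-detection definitions: true positives over all predicted positives, and true positives over all ground-truth positives. As a minimal illustrative sketch (not the authors' evaluation code, and with hypothetical detection counts chosen only to land near the reported 95.5% and 94.3%), the two metrics can be computed as:

```python
def precision_recall(tp, fp, fn):
    """Standard detection metrics from true-positive, false-positive,
    and false-negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Hypothetical counts for illustration only (not taken from the paper):
p, r = precision_recall(tp=955, fp=45, fn=58)
print(f"precision={p:.3f}, recall={r:.3f}")  # precision=0.955, recall=0.943
```

mAP additionally averages the area under the precision-recall curve over classes (and, depending on the protocol, over IoU thresholds), which is why it is reported separately from the single precision and recall figures.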
