Article

Visual Navigation Algorithm for Night Landing of Fixed-Wing Unmanned Aerial Vehicle

Journal

AEROSPACE
Volume 9, Issue 10, Pages -

Publisher

MDPI
DOI: 10.3390/aerospace9100615

Keywords

fixed-wing unmanned aerial vehicle; low-illumination image enhancement; gradient descent schemes; faster R-CNN; orthogonal iteration

Funding

  1. Interdisciplinary Innovation Fund For Doctoral Students of Nanjing University of Aeronautics and Astronautics [KXKCXJJ202203]
  2. Postgraduate Research & Practice Innovation Program of Jiangsu Province [KYCX20_0210]


This paper proposes a vision-based navigation scheme for the night-time autonomous landing of a fixed-wing UAV. The proposed algorithm comprises visible and infrared image fusion, runway detection based on an improved Faster R-CNN, and estimation of the UAV's relative attitude and position.
In recent years, visual navigation has been considered an effective mechanism for achieving autonomous landing of Unmanned Aerial Vehicles (UAVs). Nevertheless, owing to the limitations of visual cameras, the effectiveness of visual algorithms is significantly constrained by lighting conditions. Therefore, a novel vision-based navigation scheme is proposed for the night-time autonomous landing of a fixed-wing UAV. Firstly, because low-light images make the runway difficult to detect, a strategy of visible and infrared image fusion is adopted. Objective functions relating the fused image to the visible image and to the infrared image are established. The fusion problem is then transformed into an optimization problem over these objective functions, and the optimal solution is obtained by gradient descent schemes to produce the fused image. Secondly, to improve runway detection on the enhanced image, a detection algorithm based on an improved Faster region-based convolutional neural network (Faster R-CNN) is proposed. The runway ground-truth boxes of the dataset are statistically analyzed, and the size and number of anchors are redesigned to match the runway detection background based on the analysis results. Finally, a method for estimating the relative attitude and position of the UAV with respect to the landing runway is proposed. New coordinate reference systems are established, and six landing parameters, namely three attitude angles and three position components, are then calculated by Orthogonal Iteration (OI). Simulation results reveal that the proposed algorithm achieves a 1.85% improvement in AP for runway detection, and the reprojection errors of rotation and translation for pose estimation are 0.675 degrees and 0.581%, respectively.
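To make the fusion step more concrete, the sketch below illustrates how a fused image can be obtained by gradient descent on a composite objective. The quadratic objective, the weight lam, the step size, and the function name fuse_images are illustrative assumptions; the paper's actual formulation couples separate objective functions for the fused/visible and fused/infrared image pairs and may differ in detail.

    import numpy as np

    def fuse_images(visible, infrared, lam=0.5, step=0.05, iters=300):
        # Toy objective (an assumption, not the paper's exact formulation):
        #   E(F) = ||F - infrared||^2 + lam * ||grad(F) - grad(visible)||^2
        # The fused image F keeps infrared intensities (useful at night) while
        # inheriting visible-band gradients (edges, runway markings).
        def grad_x(img):
            return np.diff(img, axis=1, append=img[:, -1:])

        def grad_y(img):
            return np.diff(img, axis=0, append=img[-1:, :])

        def div(px, py):
            # Approximate negative adjoint of the forward-difference gradient.
            dx = np.diff(px, axis=1, prepend=px[:, :1])
            dy = np.diff(py, axis=0, prepend=py[:1, :])
            return dx + dy

        F = 0.5 * (visible + infrared)            # simple initial guess
        gvx, gvy = grad_x(visible), grad_y(visible)
        for _ in range(iters):
            # dE/dF = 2 (F - infrared) - 2 lam * div(grad F - grad V)
            dE = 2.0 * (F - infrared) - 2.0 * lam * div(grad_x(F) - gvx,
                                                        grad_y(F) - gvy)
            F = F - step * dE                     # plain gradient-descent update
        return np.clip(F, 0.0, 1.0)

In use, visible and infrared would be registered single-channel frames normalized to [0, 1]; the fused result could then be passed to the runway detector.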
