Article

Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual-IMU-Wheel Odometry for Agricultural Robots in Unstructured, Dynamic and GPS-Denied Greenhouse Environments

Journal

AGRONOMY-BASEL
Volume 12, Issue 8, Pages -

Publisher

MDPI
DOI: 10.3390/agronomy12081740

Keywords

real-time localization; multi-sensor fusion; dense 3D mapping; greenhouse; autonomous navigation

Funding

  1. National Natural Science Foundation of China [31901415]
  2. Jiangsu Agricultural Science and Technology Innovation Fund (JASTIF) [CX (21)3146]

Summary

This study proposes a real-time localization and mapping system that achieves precise pose estimation and dense 3D mapping in complex greenhouses through multi-sensor fusion and visual-inertial-wheel odometry. The system integrates measurements from wheel odometry, an IMU, and VIO in a loosely coupled framework based on the EKF to obtain more accurate robot state estimates. A dense 3D map of the greenhouse is reconstructed in real time using a modified ORB-SLAM2. Experimental results demonstrate the method's robustness to various challenges and show that the proposed framework improves the localization accuracy of the VIO, highlighting its promise for autonomous navigation of agricultural robots.

Abstract

Autonomous navigation in greenhouses requires agricultural robots to localize and generate a globally consistent map of their surroundings in real time. However, accurate and robust localization and mapping remain challenging for agricultural robots due to unstructured, dynamic, and GPS-denied environmental conditions. In this study, a real-time localization and mapping system is presented that achieves precise pose estimation and dense three-dimensional (3D) point cloud mapping in complex greenhouses by utilizing multi-sensor fusion and Visual-IMU-Wheel odometry. In this method, measurements from wheel odometry, an inertial measurement unit (IMU), and tightly coupled visual-inertial odometry (VIO) are integrated into a loosely coupled framework based on the Extended Kalman Filter (EKF) to obtain a more accurate estimate of the robot's state. In the multi-sensor fusion algorithm, the pose estimates from the wheel odometry and IMU are treated as predictions, and the localization results from the VIO are used as observations to update the state vector. Simultaneously, the dense 3D map of the greenhouse is reconstructed in real time using the modified ORB-SLAM2. The performance of the proposed system was evaluated in modern standard solar greenhouses with harsh environmental conditions. Taking advantage of measurements from the individual sensors, the method is robust enough to cope with various challenges, as shown by extensive experiments conducted in greenhouses and an outdoor campus environment. Additionally, the results show that the proposed framework improves the localization accuracy of the visual-inertial odometry, demonstrating the capability of the approach and highlighting its promise for autonomous navigation of agricultural robots.
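To make the fusion scheme concrete, below is a minimal sketch of a loosely coupled EKF of the kind the abstract describes, with the wheel-odometry/IMU motion model as the prediction step and a VIO pose as the observation. The planar state vector [x, y, theta], the noise covariances, and the class and method names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Minimal sketch of a loosely coupled EKF: wheel odometry + IMU drive the
# prediction, a VIO pose corrects it. All values below are illustrative
# assumptions, not the parameters used in the paper.

class LooselyCoupledEKF:
    def __init__(self):
        self.x = np.zeros(3)                   # state: [x, y, theta]
        self.P = np.eye(3) * 0.1               # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])   # process noise (odometry/IMU)
        self.R = np.diag([0.05, 0.05, 0.02])   # observation noise (VIO)

    def predict(self, v, omega, dt):
        """Propagate the state with wheel-odometry speed v and IMU yaw rate omega."""
        theta = self.x[2]
        self.x += np.array([v * np.cos(theta) * dt,
                            v * np.sin(theta) * dt,
                            omega * dt])
        # Jacobian of the unicycle motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, vio_pose):
        """Correct the prediction with a VIO pose observation [x, y, theta]."""
        H = np.eye(3)                          # VIO observes the full state directly
        y = vio_pose - self.x                  # innovation
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi  # wrap heading residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
        self.x += K @ y
        self.P = (np.eye(3) - K @ H) @ self.P

ekf = LooselyCoupledEKF()
ekf.predict(v=0.5, omega=0.1, dt=0.05)        # wheel-odometry/IMU prediction step
ekf.update(np.array([0.026, 0.001, 0.006]))   # VIO pose observation
print(ekf.x)
```

A design note implied by the abstract: in a loosely coupled scheme like this, the VIO runs as a black box and only its pose output enters the filter, so the wheel/IMU prediction can keep the estimate stable when the visual pipeline degrades.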
