Article

Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual-IMU-Wheel Odometry for Agricultural Robots in Unstructured, Dynamic and GPS-Denied Greenhouse Environments

Journal

AGRONOMY-BASEL
Volume 12, Issue 8, Article 1740

Publisher

MDPI
DOI: 10.3390/agronomy12081740

Keywords

real-time localization; multi-sensor fusion; dense 3D mapping; greenhouse; autonomous navigation

Funding

  1. National Natural Science Foundation of China [31901415]
  2. Jiangsu Agricultural Science and Technology Innovation Fund (JASTIF) [CX (21)3146]


This study proposes a real-time localization and mapping system that achieves precise pose estimation and dense 3D mapping in complex greenhouses through multi-sensor fusion and visual-inertial-wheel odometry. The system integrates measurements from wheel odometry, an IMU, and VIO in a loosely coupled framework based on the EKF to obtain more accurate robot state estimation, while a dense 3D map of the greenhouse is reconstructed in real time using a modified ORB-SLAM2. Experiments demonstrate that the method is robust to a variety of challenges and improves the localization accuracy of VIO, highlighting its promise for autonomous navigation of agricultural robots.

Autonomous navigation in greenhouses requires agricultural robots to localize themselves and build a globally consistent map of their surroundings in real time. However, accurate and robust localization and mapping remain challenging for agricultural robots because greenhouse environments are unstructured, dynamic, and GPS-denied. In this study, a real-time localization and mapping system is presented that achieves precise pose estimation and dense three-dimensional (3D) point cloud mapping in complex greenhouses by utilizing multi-sensor fusion and Visual-IMU-Wheel odometry. In this method, measurements from wheel odometry, an inertial measurement unit (IMU), and a tightly coupled visual-inertial odometry (VIO) are integrated into a loosely coupled framework based on the Extended Kalman Filter (EKF) to obtain a more accurate state estimate of the robot. In the multi-sensor fusion algorithm, the pose estimates from the wheel odometry and IMU serve as predictions, and the localization results from VIO are used as observations to update the state vector. Simultaneously, a dense 3D map of the greenhouse is reconstructed in real time by a modified ORB-SLAM2. The performance of the proposed system was evaluated in modern standard solar greenhouses under harsh environmental conditions. By exploiting the complementary measurements of the individual sensors, the method copes robustly with a variety of challenges, as shown by extensive experiments conducted in greenhouses and an outdoor campus environment. The results further show that the proposed framework improves the localization accuracy of the visual-inertial odometry, demonstrating the capability of the approach and highlighting its promise for autonomous navigation of agricultural robots.
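As a rough illustration of the fusion scheme the abstract describes, the following minimal sketch implements a planar (x, y, yaw) EKF in which wheel-odometry speed and IMU yaw rate drive the prediction step and a VIO pose provides the observation update. The class name, the 3-DoF state, and all noise values are illustrative assumptions, not the authors' implementation.

import numpy as np

# Hedged sketch of a loosely coupled EKF: wheel odometry + IMU predict,
# VIO poses correct. State is a planar pose [x, y, yaw]; noise terms
# are placeholder values chosen only for illustration.

class LooselyCoupledEKF:
    def __init__(self):
        self.x = np.zeros(3)                      # state: [x, y, yaw]
        self.P = np.eye(3) * 1e-3                 # state covariance
        self.Q = np.diag([1e-4, 1e-4, 1e-5])      # process noise (assumed)
        self.R = np.diag([1e-2, 1e-2, 1e-3])      # VIO measurement noise (assumed)

    def predict(self, v, omega, dt):
        """Propagate with wheel-odometry speed v and IMU yaw rate omega."""
        yaw = self.x[2]
        self.x += np.array([v * np.cos(yaw) * dt,
                            v * np.sin(yaw) * dt,
                            omega * dt])
        # Jacobian of the unicycle motion model w.r.t. the state
        F = np.array([[1.0, 0.0, -v * np.sin(yaw) * dt],
                      [0.0, 1.0,  v * np.cos(yaw) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z_vio):
        """Correct the prediction with a VIO pose measurement [x, y, yaw]."""
        H = np.eye(3)                                # VIO observes the full pose
        y = z_vio - H @ self.x                       # innovation
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi  # wrap the yaw residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)          # Kalman gain
        self.x += K @ y
        self.P = (np.eye(3) - K @ H) @ self.P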
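On the mapping side, dense 3D reconstruction of this kind typically rests on back-projecting keyframe depth into the world frame using the poses estimated by the SLAM front end. The paper's modified ORB-SLAM2 (a C++ system) is not reproduced here; the sketch below only illustrates that core back-projection step under assumed inputs: a depth image in meters, camera intrinsics K, and a camera-to-world pose T_wc.

import numpy as np

def backproject_keyframe(depth, K, T_wc):
    """Back-project a keyframe depth image into a world-frame point cloud.

    depth : (H, W) depth in meters; K : 3x3 camera intrinsics;
    T_wc  : 4x4 camera-to-world pose from the SLAM front end.
    Returns an (N, 3) array of world-frame points.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    z = depth.ravel()
    valid = z > 0                                  # skip pixels with no depth
    # Pixel -> normalized camera ray -> 3D point in the camera frame
    x = (u.ravel() - K[0, 2]) / K[0, 0] * z
    y = (v.ravel() - K[1, 2]) / K[1, 1] * z
    pts_c = np.stack([x, y, z, np.ones_like(z)], axis=0)[:, valid]
    # Transform into the world frame and drop the homogeneous row
    return (T_wc @ pts_c)[:3].T

In a full pipeline, the clouds produced per keyframe would be fused (e.g., voxel-downsampled and merged) to form the globally consistent dense map; that aggregation step is omitted here.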

