Journal
JOURNAL OF FIELD ROBOTICS
Volume 36, Issue 3, Pages 547-567
Publisher
WILEY
DOI: 10.1002/rob.21852
Keywords
agriculture; perception; terrestrial robotics
Funding
- Sao Paulo Research Foundation (FAPESP) [2017/00033-7, 2013/07276-1]
- Coordination for the Improvement of Higher Education Personnel (CAPES) [001]
- National Science Foundation (NSF SBIR, Directorate for Engineering, Division of Industrial Innovation & Partnerships) [1820332]
- Advanced Research Projects Agency - Energy (ARPA-E) [DE-AR0000598]
Abstract
This paper describes a light detection and ranging (LiDAR)-based autonomous navigation system for an ultralightweight ground robot in agricultural fields. The system is designed for reliable navigation under cluttered canopies using only a 2D Hokuyo UTM-30LX LiDAR sensor as the single source for perception. Its purpose is to ensure that the robot can navigate through rows of crops without damaging the plants in narrow row-based, high-leaf-cover, semistructured crop plantations, such as corn (Zea mays) and sorghum (Sorghum bicolor). The key contribution of our work is a LiDAR-based navigation algorithm capable of rejecting outlying measurements in the point cloud caused by plants in adjacent rows, low-hanging leaf cover, or weeds. The algorithm addresses this challenge using a set of heuristics designed to filter out outlying measurements in a computationally efficient manner, and linear least squares is then applied to the filtered data to estimate the within-row distance. A further crucial step is estimate validation, achieved through a heuristic that grades and validates the fitted row-lines based on current and previous information. The proposed LiDAR-based perception subsystem has been extensively tested in production and breeding corn and sorghum fields. In such a variety of highly cluttered real field environments, the robot logged more than 6 km of autonomous operation in straight rows. These results demonstrate highly promising advances in LiDAR-based navigation in realistic field environments for small under-canopy robots.
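The filter-then-fit pipeline summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the corridor-gating heuristic, the `expected_x`/`tol` parameters, and the function name are assumptions standing in for the paper's richer set of heuristics. It gates 2D LiDAR points to a band around the expected row position (rejecting adjacent-row and leaf-cover returns), fits the row-line x = a·y + b by linear least squares, and returns the robot's perpendicular distance to that line.

```python
import math

def fit_row_line(points, expected_x, tol=0.2):
    """Hypothetical sketch of the filter-then-fit step.

    points:     iterable of (x, y) LiDAR returns in the robot frame (meters)
    expected_x: prior lateral position of the crop row (heuristic gate center)
    tol:        half-width of the acceptance corridor around expected_x

    Returns (a, b, dist) for the fitted row-line x = a*y + b, where dist is
    the perpendicular distance from the robot (origin) to the line, or None
    when too few inliers survive filtering (fit cannot be validated).
    """
    # Heuristic outlier rejection: keep only returns inside the corridor.
    kept = [(x, y) for (x, y) in points if abs(x - expected_x) <= tol]
    n = len(kept)
    if n < 2:
        return None  # not enough inliers for a meaningful line fit

    # Closed-form linear least squares for x = a*y + b.
    sx = sum(x for x, _ in kept)
    sy = sum(y for _, y in kept)
    sxy = sum(x * y for x, y in kept)
    syy = sum(y * y for _, y in kept)
    denom = n * syy - sy * sy
    if abs(denom) < 1e-9:
        return None  # degenerate: all inliers at the same y
    a = (n * sxy - sx * sy) / denom
    b = (sx - a * sy) / n

    # Distance from the origin to the line a*y - x + b = 0.
    dist = abs(b) / math.sqrt(a * a + 1.0)
    return a, b, dist

# Three returns from the near row (~0.35 m) plus two adjacent-row outliers.
scan = [(0.35, 0.0), (0.36, 0.5), (0.34, 1.0), (1.10, 0.4), (1.15, 0.9)]
a, b, d = fit_row_line(scan, expected_x=0.35)
```

In the paper's system, a separate validation heuristic would then grade this fitted line against current and previous estimates before it is trusted for control; here the `None` return is a crude placeholder for rejecting an unvalidatable fit.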