Article

Under canopy light detection and ranging-based autonomous navigation

Journal

JOURNAL OF FIELD ROBOTICS
Volume 36, Issue 3, Pages 547-567

Publisher

WILEY
DOI: 10.1002/rob.21852

Keywords

agriculture; perception; terrestrial robotics

Funding

  1. Sao Paulo Research Foundation (FAPESP) [2017/00033-7, 2013/07276-1]
  2. Coordination for the Improvement of Higher Education Personnel (CAPES) [001]
  3. National Science Foundation, Directorate for Engineering, Division of Industrial Innovation & Partnerships (SBIR) [1820332]
  4. Advanced Research Projects Agency - Energy (ARPA-E) [DE-AR0000598]

Abstract

This paper describes a light detection and ranging (LiDAR)-based autonomous navigation system for an ultralightweight ground robot in agricultural fields. The system is designed for reliable navigation under cluttered canopies, using a single 2D Hokuyo UTM-30LX LiDAR sensor as its only source of perception. Its purpose is to enable the robot to navigate through rows of crops without damaging the plants in narrow, row-based, high-leaf-cover semistructured plantations such as corn (Zea mays) and sorghum (Sorghum bicolor). The key contribution of our work is a LiDAR-based navigation algorithm capable of rejecting outlying measurements in the point cloud caused by plants in adjacent rows, low-hanging leaf cover, or weeds. The algorithm addresses this challenge with a set of heuristics designed to filter out outlying measurements in a computationally efficient manner; linear least squares is then applied to the filtered data to estimate the within-row distance. A further crucial step is estimate validation, achieved through a heuristic that grades and validates the fitted row lines based on current and previous information. The proposed LiDAR-based perception subsystem has been extensively tested in production and breeding corn and sorghum fields. Across this variety of highly cluttered real field environments, the robot logged more than 6 km of autonomous runs in straight rows. These results demonstrate highly promising advances in LiDAR-based navigation in realistic field environments for small under-canopy robots.
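The pipeline outlined in the abstract (heuristic outlier rejection on the 2D point cloud, then a linear least-squares fit of the row line, then a within-row distance estimate) can be sketched roughly as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: the `max_lateral` corridor gate, the robot-frame axes, and the function name `fit_row_line` are hypothetical, and the paper's heuristics and row-line validation step are considerably richer.

```python
import numpy as np

def fit_row_line(points, max_lateral=0.6):
    """Heuristic outlier rejection plus a least-squares row-line fit.

    points: (N, 2) array of (x, y) LiDAR returns in the robot frame,
    with x lateral (toward the crop row) and y forward (assumed axes).
    max_lateral: reject returns farther than this from the robot's
    lateral axis (a crude stand-in for the paper's richer heuristics,
    meant to drop hits on adjacent rows, leaves, or weeds).

    Returns (slope, intercept, distance), where the row line is modeled
    as x = slope * y + intercept and distance is the perpendicular
    distance from the robot (origin) to that line.
    """
    pts = np.asarray(points, dtype=float)
    # Heuristic gate: keep only returns inside the expected corridor.
    keep = np.abs(pts[:, 0]) < max_lateral
    x, y = pts[keep, 0], pts[keep, 1]
    # Linear least squares: solve x = m*y + b, since crop rows run
    # roughly parallel to the robot's direction of travel.
    A = np.column_stack([y, np.ones_like(y)])
    (m, b), *_ = np.linalg.lstsq(A, x, rcond=None)
    # Perpendicular distance from the origin to the line x - m*y - b = 0.
    d = abs(b) / np.hypot(1.0, m)
    return m, b, d
```

In this sketch, a single gross outlier (e.g., a return from an adjacent row at 1.5 m lateral offset) is discarded by the gate before the fit, so the estimated line tracks the true row; the paper's validation heuristic would additionally grade such a fit against previous estimates before accepting it.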
