Article

Automated crop plant detection based on the fusion of color and depth images for robotic weed control

Journal

JOURNAL OF FIELD ROBOTICS
Volume 37, Issue 1, Pages 35-52

Publisher

WILEY
DOI: 10.1002/rob.21897

Keywords

computer vision; crop detection; robotic weeding; sensor fusion

Funding

  1. National Institute of Food and Agriculture [20136702121126]
  2. Leopold Center for Sustainable Agriculture [M2009-23, M2012-24]

Robotic weeding enables automatic, precise, and effective weed control near or within crop rows. A computer-vision system was developed for detecting crop plants at different growth stages for robotic weed control. Fusion of color images and depth images was investigated as a means of enhancing the detection accuracy of crop plants under conditions of high weed population. In-field images of broccoli and lettuce were acquired 3-27 days after transplanting with a Kinect v2 sensor. The image processing pipeline included data preprocessing, vegetation pixel segmentation, plant extraction, feature extraction, feature-based localization refinement, and crop plant classification. For the detection of broccoli and lettuce, the color-depth fusion algorithm produced high true-positive detection rates (91.7% and 90.8%, respectively) and low average false discovery rates (1.1% and 4.0%, respectively). Mean absolute localization errors of the crop plant stems were 26.8 and 7.4 mm for broccoli and lettuce, respectively. The fusion of color and depth proved beneficial for segmenting crop plants from the background, improving the average segmentation success rate from 87.2% (depth-based) and 76.4% (color-based) to 96.6% for broccoli, and from 74.2% (depth-based) and 81.2% (color-based) to 92.4% for lettuce. The fusion-based algorithm had reduced performance in detecting crop plants at early growth stages.
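The color-depth fusion idea summarized in the abstract can be pictured with a short sketch: a greenness index thresholded on the color image, a height cue thresholded on the registered depth image, and a logical combination of the two masks. This is a minimal illustration under stated assumptions (a flat ground plane, fixed thresholds, the generic Excess Green index), not the paper's actual segmentation algorithm; the function name, parameters, and thresholds below are hypothetical.

    import numpy as np

    def vegetation_mask(rgb, depth, exg_thresh=0.10, height_thresh=0.03):
        """Illustrative color-depth fusion for vegetation segmentation.

        rgb   : H x W x 3 float array, color image scaled to [0, 1]
        depth : H x W float array, distance from the sensor in meters
        The ExG index, the flat-ground approximation, and both thresholds
        are assumptions for this sketch, not values from the paper.
        """
        # Normalized chromaticity, then Excess Green (ExG = 2g - r - b),
        # a common greenness measure for separating vegetation from soil.
        s = rgb.sum(axis=2) + 1e-6
        r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
        color_mask = (2 * g - r - b) > exg_thresh

        # Depth cue: pixels standing above an approximate ground level.
        # The ground is taken as the median depth here (flat-field
        # simplification); smaller depth means closer to the sensor,
        # i.e., taller vegetation.
        ground = np.median(depth)
        depth_mask = (ground - depth) > height_thresh

        # Fusion: keep pixels supported by either cue, recovering
        # vegetation that one modality alone would miss.
        return color_mask | depth_mask

    # Example with synthetic stand-ins for Kinect v2 frames (512 x 424 depth grid):
    rgb = np.random.rand(424, 512, 3)
    depth = 1.2 - 0.05 * np.random.rand(424, 512)
    mask = vegetation_mask(rgb, depth)

A logical OR favors recall (fewer missed plant pixels); an AND would instead favor precision by requiring agreement between the color and depth cues. Which combination is appropriate depends on the downstream classification step and is a design choice, not something fixed by the abstract.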
