Article

SLOAM: Semantic Lidar Odometry and Mapping for Forest Inventory

Journal

IEEE ROBOTICS AND AUTOMATION LETTERS
Volume 5, Issue 2, Pages 612-619

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LRA.2019.2963823

Keywords

Robotics in agriculture and forestry; SLAM; deep learning in robotics and automation; virtual reality and interfaces

Funding

  1. Trefo LLC under NSF SBIR Grant [193856]
  2. ARL [DCIST CRA W911NF-17-2-0181]
  3. ONR [N00014-07-1-0829]
  4. ARO [W911NF-13-1-0350]
  5. INCT-INSac Grants [CNPq 465755/2014-3, FAPESP 2014/50851-0, 2018/24526-5]
  6. Semiconductor Research Corporation
  7. DARPA
  8. NVIDIA (NVAIL program)

Abstract

This letter describes an end-to-end pipeline for tree diameter estimation based on semantic segmentation and lidar odometry and mapping. Accurately mapping this type of environment is challenging since the ground and the trees are surrounded by leaves, thorns, and vines, and the sensor typically experiences extreme motion. We propose a semantic-feature-based pose optimization that simultaneously refines the tree models while estimating the robot pose. The pipeline utilizes a custom virtual reality tool for labeling 3D scans, which is used to train a semantic segmentation network. The masked point cloud is used to compute a trellis graph that identifies individual tree instances and extracts the relevant features used by the SLAM module. We show that traditional lidar- and image-based methods fail in the forest environment on both Unmanned Aerial Vehicle (UAV) and hand-carried systems, while our method is more robust and scalable, and automatically generates tree diameter estimates.
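The abstract describes the pipeline only at a high level. As a rough illustration of its final step, the sketch below shows one generic way a tree diameter at breast height (DBH) could be estimated from the trunk points returned by a segmentation stage, by fitting a least-squares circle to a breast-height slice. The function name `estimate_dbh`, the 1.3 m breast height, the slice width, and the Kasa circle fit are illustrative assumptions and not the paper's method; SLOAM itself refines tree models jointly with the robot pose in its pose optimization.

```python
# Minimal sketch (not the paper's implementation): estimate a single tree's
# diameter at breast height (DBH) from trunk points produced by a semantic
# segmentation step. Assumes points are in a gravity-aligned frame with z up;
# the breast height and slice width are illustrative parameters.
import numpy as np

def estimate_dbh(trunk_points: np.ndarray,
                 breast_height: float = 1.3,
                 slice_half_width: float = 0.05) -> float:
    """Fit a circle (Kasa least-squares fit) to a horizontal slice of trunk
    points around breast height and return its diameter in meters."""
    z = trunk_points[:, 2]
    ground = z.min()                      # crude ground estimate for the sketch
    mask = np.abs(z - (ground + breast_height)) < slice_half_width
    xy = trunk_points[mask, :2]
    if xy.shape[0] < 3:
        raise ValueError("not enough points in the breast-height slice")

    # Kasa fit: solve x^2 + y^2 = a*x + b*y + c in the least-squares sense;
    # the circle center is (a/2, b/2) and radius^2 = c + a^2/4 + b^2/4.
    A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    radius = np.sqrt(c + 0.25 * (a ** 2 + b ** 2))
    return 2.0 * radius

# Example usage with synthetic trunk points sampled from a 0.4 m diameter tree.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, 500)
    height = rng.uniform(0.0, 3.0, 500)
    pts = np.column_stack([
        0.2 * np.cos(theta) + 0.01 * rng.standard_normal(500),
        0.2 * np.sin(theta) + 0.01 * rng.standard_normal(500),
        height,
    ])
    print(f"estimated DBH: {estimate_dbh(pts):.3f} m")
```

On the synthetic trunk above the fit recovers a diameter close to 0.4 m; on real forest scans the slice selection and ground estimate would of course depend on the segmentation and mapping stages described in the letter.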
