Article

Volumetric Occupancy Mapping With Probabilistic Depth Completion for Robotic Navigation

Journal

IEEE ROBOTICS AND AUTOMATION LETTERS
Volume 6, Issue 3, Pages 5072-5079

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LRA.2021.3070308

Keywords

Computer vision; simultaneous localisation and mapping; mobile robots; machine learning

Funding

  1. EPSRC ORCA Robotics Hub [EP/R026173/1]
  2. EPSRC Aerial ABM [EP/N018494/1]
  3. Imperial College London
  4. SLAMcore Ltd.
  5. Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) [EXC 2070 - 390732324]

Abstract
In robotic applications, a key requirement for safe and efficient motion planning is the ability to map obstacle-free space in unknown, cluttered 3D environments. However, commodity-grade RGB-D cameras commonly used for sensing fail to register valid depth values on shiny, glossy, bright, or distant surfaces, leading to missing data in the map. To address this issue, we propose a framework leveraging probabilistic depth completion as an additional input for spatial mapping. We introduce a deep learning architecture providing uncertainty estimates for the depth completion of RGB-D images. Our pipeline exploits the inferred missing depth values and depth uncertainty to complement raw depth images and improve the speed and quality of free space mapping. Evaluations on synthetic data show that our approach maps significantly more correct free space with relatively low error when compared against using raw data alone in different indoor environments; thereby producing more complete maps that can be directly used for robotic navigation tasks. The performance of our framework is validated using real-world data.
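The paper's own implementation is not reproduced here. As a minimal sketch of the general idea it describes (filling holes in raw depth with network predictions, then weighting the occupancy-map update by the prediction's uncertainty), the NumPy snippet below may help; all function names, thresholds, and log-odds constants are hypothetical illustrations, not the authors' values.

```python
import numpy as np

def fuse_depth(raw, completed, sigma, sigma_max=0.5):
    """Fill holes in a raw depth image with network predictions.

    Assumed conventions: `raw` contains NaN where the sensor returned
    no depth; `completed` and `sigma` are the network's predicted depth
    and per-pixel uncertainty (std. dev., in metres). Predictions that
    are too uncertain (sigma > sigma_max) are left as holes.
    """
    fused = raw.copy()
    hole = np.isnan(raw) & (sigma <= sigma_max)
    fused[hole] = completed[hole]
    # Confidence in [0, 1]: 1 for measured pixels, decaying with sigma
    # for completed pixels, 0 where the hole could not be filled.
    conf = np.where(np.isnan(raw),
                    np.clip(1.0 - sigma / sigma_max, 0.0, 1.0),
                    1.0)
    conf[np.isnan(fused)] = 0.0
    return fused, conf

def logodds_update(l_prior, depth, conf, cell_z,
                   hit_odds=0.85, miss_odds=-0.4):
    """Standard inverse-sensor-model log-odds update for one map cell
    along a ray, scaled by pixel confidence so that uncertain completed
    depth moves the map less than a direct measurement."""
    if np.isnan(depth):
        return l_prior  # no information for this ray
    if abs(cell_z - depth) < 0.05:   # cell near the measured surface
        delta = hit_odds
    elif cell_z < depth:             # cell in observed free space
        delta = miss_odds
    else:                            # cell behind the surface: unknown
        delta = 0.0
    return l_prior + conf * delta
```

Design note: tying the update weight to the network's uncertainty lets confidently completed regions (e.g. a glossy tabletop) be carved out as free space quickly, while poorly constrained predictions leave the map close to its prior.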


