Proceedings Paper

Towards Real-time Semantic RGB-D SLAM in Dynamic Environments

Publisher

IEEE
DOI: 10.1109/ICRA48506.2021.9561743


Funding

  1. Delta-NTU Corporate Laboratory for Cyber-Physical Systems - National Research Foundation (NRF) Singapore under its Corporate Laboratory@University Scheme

Abstract

Most existing visual SLAM methods rely heavily on a static-world assumption and easily fail in dynamic environments. Some recent works eliminate the influence of dynamic objects by introducing deep learning-based semantic information into SLAM systems. However, such methods suffer from high computational cost and cannot handle unknown objects. In this paper, we propose a real-time semantic RGB-D SLAM system for dynamic environments that is capable of detecting both known and unknown moving objects. To reduce the computational cost, we perform semantic segmentation only on keyframes to remove known dynamic objects, and maintain a static map for robust camera tracking. Furthermore, we propose an efficient geometry module to detect unknown moving objects by clustering the depth image into a few regions and identifying the dynamic regions via their reprojection errors. The proposed method is evaluated on public datasets and under real-world conditions. To the best of our knowledge, it is one of the first semantic RGB-D SLAM systems that run in real time on a low-power embedded platform and provide high localization accuracy in dynamic environments.
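The geometry module's core idea — cluster the depth image, then flag clusters whose reprojection error under the estimated camera motion is large — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the cluster labels, intrinsics `K`, relative pose `T_cur_prev`, and the depth-residual threshold are all assumed inputs, and a simple per-cluster mean depth residual stands in for whatever error metric the paper actually uses.

```python
import numpy as np

def backproject(depth, K):
    """Back-project a depth image into 3-D points in the camera frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

def dynamic_clusters(depth_prev, depth_cur, labels, T_cur_prev, K, thresh=0.05):
    """Return ids of depth clusters whose mean depth residual exceeds `thresh`.

    labels      : per-pixel cluster id from depth-image clustering (assumed given)
    T_cur_prev  : 4x4 rigid transform, previous camera frame -> current frame
    thresh      : residual threshold in metres (hypothetical value)
    """
    # Warp previous-frame points into the current camera frame.
    pts = backproject(depth_prev, K).reshape(-1, 3)
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
    pts_cur = (T_cur_prev @ pts_h.T).T[:, :3]

    # Project into the current image and look up the observed depth.
    z = pts_cur[:, 2]
    u = np.round(K[0, 0] * pts_cur[:, 0] / z + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts_cur[:, 1] / z + K[1, 2]).astype(int)
    h, w = depth_cur.shape
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    residual = np.full(pts.shape[0], np.nan)
    residual[valid] = np.abs(depth_cur[v[valid], u[valid]] - z[valid])
    residual = residual.reshape(depth_prev.shape)

    # A cluster is dynamic if its mean residual is large: a static region
    # re-projects consistently, while a moving object does not.
    dynamic = set()
    for c in np.unique(labels):
        r = residual[labels == c]
        r = r[~np.isnan(r)]
        if r.size and np.mean(r) > thresh:
            dynamic.add(int(c))
    return dynamic
```

With a static camera (identity transform), a region whose depth changed between frames produces a large residual and is flagged, while the unchanged background is not — which is how this test can catch movers that the semantic segmenter has no class for.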
