Article

Constrained RGBD-SLAM

Journal

ROBOTICA
Volume 39, Issue 2, Pages 277-290

Publisher

CAMBRIDGE UNIV PRESS
DOI: 10.1017/S0263574720000363

Keywords

Simultaneous localization and mapping; RGBD sensor; Bundle adjustment; 3D model; RGBD-SLAM

This paper introduces a new RGBD-SLAM system that combines visual and depth data in a local bundle adjustment to improve localization accuracy, and that can exploit prior knowledge of the scene when a 3D model is available. The system has been evaluated on public benchmark datasets and in real environments, and can be used for localization of lightweight robots, UAVs, and VR helmets.
This paper introduces a new RGBD-Simultaneous Localization And Mapping (RGBD-SLAM) system based on a revisited keyframe SLAM. This solution improves localization by combining visual and depth data in a local bundle adjustment. It then presents an extension of this RGBD-SLAM that takes advantage of partial knowledge of the scene: when a prior 3D model of the environment is available, it is used as a constraint, which drastically improves localization accuracy. The proposed solutions, called RGBD-SLAM and Constrained RGBD-SLAM, are evaluated on several public benchmark datasets and on real scenes acquired with a Kinect sensor. The system runs in real time on a standard central processing unit and can be useful for applications such as localization of lightweight robots, UAVs, and VR helmets.
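The abstract states that localization is refined by combining visual and depth data in a local bundle adjustment. The sketch below is not the authors' implementation, only a minimal toy illustration of that general idea under assumed choices: a pinhole camera model, an axis-angle pose parameterization, Kinect-like intrinsics (fx, fy, cx, cy), a scalar depth_weight balancing the two terms, and a SciPy least-squares solver. The Constrained RGBD-SLAM variant would add further terms tying the reconstruction to the known 3D model, which is omitted here.

```python
# Minimal illustrative sketch (NOT the paper's implementation): a local bundle
# adjustment whose cost stacks 2D reprojection residuals from the color image
# with depth residuals from the RGBD sensor. All constants and toy data below
# are assumptions made for this example.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5   # assumed Kinect-like intrinsics
depth_weight = 0.5                            # assumed weight of the depth term

def residuals(params, n_cams, n_pts, cam_idx, pt_idx, obs_uv, obs_depth):
    """Stacked visual (u, v) and depth residuals for all observations."""
    poses = params[:6 * n_cams].reshape(n_cams, 6)      # per-keyframe [rotvec, t]
    points = params[6 * n_cams:].reshape(n_pts, 3)      # 3D map points
    R = Rotation.from_rotvec(poses[cam_idx, :3])
    pc = R.apply(points[pt_idx]) + poses[cam_idx, 3:]   # points in camera frames
    u = fx * pc[:, 0] / pc[:, 2] + cx
    v = fy * pc[:, 1] / pc[:, 2] + cy
    r_visual = np.concatenate([u - obs_uv[:, 0], v - obs_uv[:, 1]])
    r_depth = depth_weight * (pc[:, 2] - obs_depth)     # measured depth vs. predicted z
    return np.concatenate([r_visual, r_depth])

# Toy local window: 2 keyframes observing 4 points; perturbed points are refined.
rng = np.random.default_rng(0)
points_gt = np.array([[0.0, 0.0, 2.0], [0.5, -0.2, 2.5],
                      [-0.4, 0.3, 3.0], [0.2, 0.1, 1.8]])
poses = np.zeros((2, 6))
poses[1, 3] = 0.1                                       # second keyframe translated 10 cm
cam_idx = np.repeat([0, 1], 4)
pt_idx = np.tile(np.arange(4), 2)
pc = Rotation.from_rotvec(poses[cam_idx, :3]).apply(points_gt[pt_idx]) + poses[cam_idx, 3:]
obs_uv = np.stack([fx * pc[:, 0] / pc[:, 2] + cx, fy * pc[:, 1] / pc[:, 2] + cy], axis=1)
obs_depth = pc[:, 2]
x0 = np.concatenate([poses.ravel(),
                     (points_gt + 0.05 * rng.standard_normal(points_gt.shape)).ravel()])
sol = least_squares(residuals, x0, args=(2, 4, cam_idx, pt_idx, obs_uv, obs_depth))
print("final cost:", sol.cost)
```

Minimizing the visual and depth residuals jointly, rather than the visual ones alone, is what lets the depth channel constrain scale and geometry in the local window; a weighting term like depth_weight (an assumption here) is one simple way to balance pixel-unit and metric-unit residuals.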
