Article

Application of multisensor fusion to develop a personal location and 3D mapping system

Journal

OPTIK
Volume 172, Pages 328-339

Publisher

ELSEVIER GMBH, URBAN & FISCHER VERLAG
DOI: 10.1016/j.ijleo.2018.07.029

Keywords

Simultaneous localization and mapping; Kinect; Inertial measurement unit; Laser range finder; Extended Kalman filter


With the popularization of indoor pedestrian positioning systems in recent years, many 3D indoor map-building methods and stereoscopic maps for visualization purposes have been developed. This study presents a human-portable 3D simultaneous localization and mapping (SLAM) system that can adapt to various environmental scenarios. Building on RGB-D SLAM and IMU/laser SLAM, we propose a sensor-fusion SLAM algorithm that combines the advantages of these two approaches and fuses a Microsoft Kinect, a Hokuyo laser range finder (LRF), and inertial measurement unit (IMU) sensors to implement 3D positioning and mapping. For positioning, the data from the IMU are used to estimate the user's velocity and attitude. To correct drift from the inertial sensor, the update procedure of the extended Kalman filter in this system relies mainly on displacement estimated from the Kinect sensor. When the displacement estimated from the Kinect sensor is judged to be a failure, the walking velocity estimated from the LRF is used instead to update the extended Kalman filter. Finally, the colored point cloud extracted from the Kinect is used to create dense 3D environmental maps. Experimental results for three different scenarios show that the proposed algorithm outperforms the other two algorithms.
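The fusion strategy the abstract describes can be sketched as a small extended Kalman filter loop: the IMU drives the prediction step, a Kinect displacement measurement corrects position when the visual odometry succeeds, and an LRF-derived walking velocity is used as the fallback correction when it fails. The sketch below is a minimal one-axis illustration under assumed noise values and function names; it is not the authors' actual implementation or parameters.

```python
import numpy as np

# State along one axis: x = [position, velocity].
# q, r values below are illustrative assumptions, not the paper's tuning.

def ekf_predict(x, P, accel, dt, q=0.1):
    """IMU-driven prediction: integrate acceleration into velocity and position."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def ekf_update(x, P, z, H, r):
    """Linear update for a scalar measurement z with model H."""
    S = float(H @ P @ H.T) + r          # innovation covariance (scalar)
    K = (P @ H.T) / S                   # Kalman gain, shape (2, 1)
    x = x + K.flatten() * float(z - H @ x)
    P = (np.eye(2) - np.outer(K, H)) @ P
    return x, P

def fuse_step(x, P, accel, dt, kinect_pos=None, lrf_vel=None):
    """One fusion cycle: predict with the IMU, then correct with Kinect
    displacement if available, otherwise fall back to LRF velocity."""
    x, P = ekf_predict(x, P, accel, dt)
    if kinect_pos is not None:          # Kinect visual odometry succeeded
        x, P = ekf_update(x, P, kinect_pos, np.array([[1.0, 0.0]]), r=0.05)
    elif lrf_vel is not None:           # fallback: LRF scan-matched velocity
        x, P = ekf_update(x, P, lrf_vel, np.array([[0.0, 1.0]]), r=0.2)
    return x, P
```

The key design point mirrored here is the conditional branch: the Kinect measurement observes position (H selects the first state component), while the LRF fallback observes velocity (H selects the second), so either sensor can bound the IMU's drift.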

