Article

Towards functional labeling of utility vehicle point clouds for humanoid driving

Journal

INTELLIGENT SERVICE ROBOTICS
Volume 7, Issue 3, Pages 133-143

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s11370-014-0157-7

Keywords

3-D perception; RGBD camera; Laser range-finder; Vehicle part detection and tracking

Abstract

We present work on analyzing 3-D point clouds of a small utility vehicle for purposes of humanoid robot driving. The scope of this work is limited to a subset of ingress-related tasks including stepping up into the vehicle and grasping the steering wheel. First, we describe how partial point clouds are acquired from different perspectives using sensors, including a stereo camera and a tilting laser range finder. For finer detail and a larger model than one sensor view alone can capture, a KinectFusion-like algorithm (Izadi et al., "KinectFusion: real-time 3D reconstruction and interaction using a moving depth camera", 2011) is used to integrate the stereo point clouds as the sensor head is moved around the vehicle. Second, we discuss how individual sensor views can be registered to the overall vehicle model to provide context, and present methods to estimate, both statically and dynamically, several geometric parameters critical to motion planning: (1) the floor height and the boundaries defined by the seat and the dashboard, and (2) the steering wheel pose and dimensions. Results are compared across the different sensors, and the usefulness of the estimated quantities for motion planning is also addressed.
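One of the geometric parameters the abstract mentions, the cabin floor height, can be illustrated with a RANSAC plane fit over a point cloud. The sketch below is a minimal, generic illustration of that idea, not the paper's actual algorithm; the function name, synthetic data, and thresholds are all assumptions.

```python
import numpy as np

def fit_floor_plane(points, n_iters=200, inlier_thresh=0.02, seed=None):
    """RANSAC plane fit: returns (normal, d) with normal . p + d = 0.

    A minimal sketch of estimating a dominant planar surface (the
    cabin floor) from a 3-D point cloud; not the paper's method.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = 0, None
    for _ in range(n_iters):
        # Sample three distinct points and form a candidate plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                 # degenerate (collinear) sample
            continue
        normal = normal / norm
        d = -normal @ p0
        # Count points within the inlier distance of the plane.
        dist = np.abs(points @ normal + d)
        inliers = int((dist < inlier_thresh).sum())
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane

# Synthetic "cabin scan": a flat floor at z = 0.35 m plus clutter.
rng = np.random.default_rng(0)
floor = np.column_stack([rng.uniform(0, 1, 500),
                         rng.uniform(0, 1, 500),
                         0.35 + rng.normal(0, 0.005, 500)])
clutter = rng.uniform(0, 1.5, (100, 3))
normal, d = fit_floor_plane(np.vstack([floor, clutter]), seed=1)
floor_height = abs(-d / normal[2])      # z-intercept of the fitted plane
print(floor_height)                      # a value near 0.35
```

In the paper's setting the same estimate would be computed on real stereo or laser range-finder clouds registered to the vehicle model, with the floor region first segmented from the seat and dashboard.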
