Article

Yale-CMU-Berkeley dataset for robotic manipulation research

Journal

INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH
Volume 36, Issue 3, pp. 261-268

Publisher

SAGE PUBLICATIONS LTD
DOI: 10.1177/0278364917700714

Keywords

Benchmarking; manipulation; grasping; simulation

Funding

  1. National Science Foundation [IIS-0953856, IIS-1139078, IIS-1317976]
  2. Direct For Computer & Info Scie & Enginr
  3. Div Of Information & Intelligent Systems [1317976] Funding Source: National Science Foundation

Abstract

In this paper, we present an image and model dataset of the real-life objects from the Yale-CMU-Berkeley Object Set, which is specifically designed for benchmarking in manipulation research. For each object, the dataset presents 600 high-resolution RGB images, 600 RGB-D images and five sets of textured three-dimensional geometric models. Segmentation masks and calibration information for each image are also provided. These data are acquired using the BigBIRD Object Scanning Rig and Google Scanners. Together with the dataset, Python scripts and a Robot Operating System node are provided to download the data, generate point clouds and create Unified Robot Description Files. The dataset is also supported by our website, www.ycbbenchmarks.org, which serves as a portal for publishing and discussing test results along with proposing task protocols and benchmarks.
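The abstract notes that the provided Python scripts generate point clouds from the RGB-D images and their calibration data. The dataset's own tooling lives at www.ycbbenchmarks.org; as a generic illustration of the underlying operation, the sketch below back-projects a depth image through pinhole intrinsics into a point cloud. The function name and the intrinsic values are illustrative, not taken from the dataset's scripts.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into an Nx3 point cloud
    using a pinhole camera model. Zero-depth pixels are dropped."""
    h, w = depth.shape
    # Pixel coordinate grids: u = column index, v = row index.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=1)

# Tiny synthetic example: a 2x2 depth image with every pixel at 1 m.
depth = np.ones((2, 2), dtype=np.float64)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
print(cloud.shape)  # (4, 3)
```

With real data, `fx`, `fy`, `cx`, and `cy` would come from the per-image calibration files the dataset ships, and the segmentation masks could be applied as an additional filter alongside the zero-depth check.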

