Journal
INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH
Volume 36, Issue 3, Pages 261-268
Publisher
SAGE PUBLICATIONS LTD
DOI: 10.1177/0278364917700714
Keywords
Benchmarking; manipulation; grasping; simulation
Funding
- National Science Foundation [IIS-0953856, IIS-1139078, IIS-1317976]
- Directorate for Computer & Information Science & Engineering
- Division of Information & Intelligent Systems [1317976] Funding Source: National Science Foundation
In this paper, we present an image and model dataset of the real-life objects from the Yale-CMU-Berkeley Object Set, which is specifically designed for benchmarking in manipulation research. For each object, the dataset presents 600 high-resolution RGB images, 600 RGB-D images and five sets of textured three-dimensional geometric models. Segmentation masks and calibration information for each image are also provided. These data are acquired using the BigBIRD Object Scanning Rig and Google Scanners. Together with the dataset, Python scripts and a Robot Operating System node are provided to download the data, generate point clouds and create Unified Robot Description Files. The dataset is also supported by our website, www.ycbbenchmarks.org, which serves as a portal for publishing and discussing test results along with proposing task protocols and benchmarks.
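The abstract notes that the provided Python scripts generate point clouds from the RGB-D images and per-image calibration. A minimal sketch of that step, assuming a pinhole camera model with a 3x3 intrinsic matrix `K` (the function name and depth-scale parameter here are illustrative, not the dataset's actual API):

```python
import numpy as np

def depth_to_point_cloud(depth, K, depth_scale=1.0):
    """Back-project an H x W depth image to an N x 3 point cloud
    using pinhole intrinsics K (3 x 3). Pixels with zero depth
    (no sensor return) are dropped."""
    h, w = depth.shape
    fx, fy = K[0, 0], K[1, 1]  # focal lengths (pixels)
    cx, cy = K[0, 2], K[1, 2]  # principal point
    # Pixel coordinate grids: u varies along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64) * depth_scale
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```

In practice the dataset's segmentation masks would be applied first so that only object pixels are back-projected, and the calibration files supply `K` per camera.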