3.8 Proceedings Paper

BEHAVE: Dataset and Method for Tracking Human Object Interactions

Publisher

IEEE Computer Society
DOI: 10.1109/CVPR52688.2022.01547

Keywords

-

Funding

  1. Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) [409792180]
  2. German Federal Ministry of Education and Research (BMBF): Tübingen AI Center [FKZ: 01IS18039A]
  3. ERC Consolidator Grant 4DRepLy [770784]
  4. Machine Learning Cluster of Excellence, EXC [2064/1, 390727645]

Abstract

Modelling interactions between humans and objects in natural environments is crucial for various applications. The lack of a comprehensive dataset has hindered progress in this area. We introduce the BEHAVE dataset, the first dataset that includes multi-view RGBD frames, 3D SMPL and object fits, and annotated contacts between humans and objects. We use this dataset to develop a model for tracking human-object interactions in natural environments using a portable multi-camera setup.
Modelling interactions between humans and objects in natural environments is central to many applications, including gaming, virtual and mixed reality, as well as human behavior analysis and human-robot collaboration. This challenging operation scenario requires generalization to a vast number of objects, scenes, and human actions. Unfortunately, no such dataset exists. Moreover, this data needs to be acquired in diverse natural environments, which rules out 4D scanners and marker-based capture systems. We present the BEHAVE dataset, the first full-body human-object interaction dataset with multi-view RGBD frames and corresponding 3D SMPL and object fits, along with the annotated contacts between them. We record approximately 15k frames at 5 locations with 8 subjects performing a wide range of interactions with 20 common objects. We use this data to learn a model that can jointly track humans and objects in natural environments with an easy-to-use, portable multi-camera setup. Our key insight is to predict correspondences from the human and the object to a statistical body model to obtain human-object contacts during interactions. Our approach can record and track not just the humans and objects but also their interactions, modeled as surface contacts, in 3D. Our code and data can be found at: http://virtualhumans.mpi-inf.mpg.de/behave.
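To make the "interactions modeled as surface contacts" idea concrete, the sketch below shows one plausible way to derive per-frame contact labels once fitted SMPL (human) vertices and fitted object vertices are available: threshold nearest-neighbour distances between the two surfaces. This is an illustrative assumption for exposition, not the authors' released pipeline; the threshold value and variable names are made up for the example.

```python
# Minimal sketch (not the BEHAVE code): flag human-object surface contacts by
# nearest-neighbour distance between a fitted SMPL mesh and a fitted object mesh.
import numpy as np
from scipy.spatial import cKDTree

def surface_contacts(smpl_verts: np.ndarray,
                     object_verts: np.ndarray,
                     threshold: float = 0.02):
    """Return indices of SMPL and object vertices lying within `threshold`
    metres of the other surface (a simple contact proxy)."""
    human_tree = cKDTree(smpl_verts)

    # Distance from every object vertex to its closest SMPL vertex.
    d_obj_to_human, nearest_human = human_tree.query(object_verts)

    contact_obj_idx = np.where(d_obj_to_human < threshold)[0]
    contact_human_idx = nearest_human[contact_obj_idx]
    return contact_human_idx, contact_obj_idx

if __name__ == "__main__":
    # Toy example with random point clouds standing in for fitted meshes.
    rng = np.random.default_rng(0)
    smpl_verts = rng.uniform(-1, 1, size=(6890, 3))   # SMPL has 6890 vertices
    object_verts = rng.uniform(-1, 1, size=(2000, 3))
    h_idx, o_idx = surface_contacts(smpl_verts, object_verts)
    print(f"{len(o_idx)} object vertices flagged as in contact")
```

In practice a contact definition like this would be applied to the dataset's registered meshes per frame; the 2 cm threshold here is only a placeholder assumption.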
