Article

Filtered pose graph for efficient kinect pose reconstruction

Journal

MULTIMEDIA TOOLS AND APPLICATIONS
Volume 76, Issue 3, Pages 4291-4312

Publisher

SPRINGER
DOI: 10.1007/s11042-016-3546-4

Keywords

Kinect; Pose reconstruction; Occlusion; Motion analysis

Funding

  1. Cifre convention [N1222/2012]
  2. Faurecia Company
  3. Engineering and Physical Sciences Research Council (EPSRC) [EP/M002632/1]


Being marker-free and calibration-free, the Microsoft Kinect is now widely used in motion-based applications such as user training for complex industrial tasks and ergonomic pose evaluation. Its major limitations are the strict placement requirements needed to obtain accurate poses and its susceptibility to occlusion. To improve the robustness of Kinect in interactive motion-based applications, real-time data-driven pose reconstruction has been proposed: a database of accurately captured human poses serves as a prior for optimizing the Kinect-recognized poses, in order to estimate the true poses performed by the user. The key research problem is identifying the most relevant database poses for accurate and efficient reconstruction. In this paper, we propose a new pose reconstruction method that models the pose database with a structure called the Filtered Pose Graph, which encodes the intrinsic correspondence between poses. This graph not only speeds up the selection of database poses but also improves the relevance of the selected poses, yielding higher-quality reconstruction. We apply the proposed method in a challenging industrial environment involving sub-optimal Kinect placement and a large amount of occlusion. Experimental results show that our real-time system reconstructs Kinect poses more accurately than existing methods.
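To illustrate the general idea behind graph-based database pose selection, the sketch below shows one plausible (hypothetical) realization: database poses are linked into a graph by similarity, and at runtime only the neighbourhood of the previously selected pose is searched instead of the full database. All function names, the distance measure, and the threshold are assumptions for illustration, not the paper's actual construction.

```python
import numpy as np

def build_pose_graph(poses, threshold):
    """Connect database poses whose mean joint distance is below a
    threshold. `poses` is an (N, J, 3) array of N poses with J joints.
    Hypothetical sketch of a pose graph, not the paper's exact method."""
    n = len(poses)
    diff = poses[:, None] - poses[None, :]           # (N, N, J, 3)
    dist = np.linalg.norm(diff, axis=-1).mean(-1)    # (N, N) mean joint distance
    return {i: [j for j in range(n) if j != i and dist[i, j] < threshold]
            for i in range(n)}

def select_candidates(graph, poses, observed, prev, k=3):
    """Pick the k database poses closest to the observed (noisy) Kinect
    pose, searching only the previous node and its graph neighbours
    rather than the whole database."""
    candidates = [prev] + graph[prev]
    dists = [np.linalg.norm(poses[c] - observed) for c in candidates]
    order = np.argsort(dists)[:k]
    return [candidates[i] for i in order]
```

Because each query touches only a node's neighbourhood, selection cost depends on graph degree rather than database size, and temporal coherence of motion keeps the true pose near the previously selected node.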

