Article

UNOC: Understanding Occlusion for Embodied Presence in Virtual Reality

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TVCG.2021.3085407

Keywords

Tracking; Cameras; Three-dimensional displays; Headphones; Kinematics; Videos; Optical sensors; Motion capture; machine learning; body tracking; embodied presence; virtual reality


Abstract

Tracking body and hand motions in 3D space is essential for social and self-presence in augmented and virtual environments. Unlike the popular 3D pose estimation setting, the problem is often formulated as egocentric tracking based on embodied perception (e.g., egocentric cameras, handheld sensors). In this article, we propose a new data-driven framework for egocentric body tracking, targeting the challenge of omnipresent occlusions in optimization-based methods (e.g., inverse kinematics solvers). We first collect a large-scale motion capture dataset with both body and finger motions using optical markers and inertial sensors. This dataset focuses on social scenarios and captures ground-truth poses under self-occlusions and body-hand interactions. We then simulate the occlusion patterns in head-mounted camera views on the captured ground truth using a ray casting algorithm, and train a deep neural network to infer the occluded body parts. Our experiments show that the proposed method generates high-fidelity embodied poses when applied to real-time egocentric body tracking, finger motion synthesis, and 3-point inverse kinematics.
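The occlusion-simulation step described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes a simplified body model in which blocking body parts are approximated as spheres, and it marks a joint as occluded when the ray from the head-mounted camera to that joint first intersects a blocking sphere. All function names and parameters here are hypothetical.

```python
import math

def ray_sphere_hit(origin, target, center, radius):
    """Return True if the segment origin->target passes through the sphere.

    Solves the quadratic |origin + t*(target-origin) - center|^2 = radius^2
    and accepts hits with t strictly between 0 (camera) and 1 (joint).
    """
    d = [t - o for o, t in zip(origin, target)]   # ray direction
    f = [o - c for o, c in zip(origin, center)]   # origin relative to sphere
    a = sum(x * x for x in d)
    b = 2.0 * sum(x * y for x, y in zip(f, d))
    c = sum(x * x for x in f) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False                              # ray misses the sphere
    t1 = (-b - math.sqrt(disc)) / (2.0 * a)
    t2 = (-b + math.sqrt(disc)) / (2.0 * a)
    return (0.0 < t1 < 1.0) or (0.0 < t2 < 1.0)

def occluded_joints(camera, joints, blockers):
    """Label each joint as occluded (True) or visible (False).

    camera:   3D position of the head-mounted camera
    joints:   dict of joint name -> 3D position
    blockers: list of (center, radius) spheres approximating body parts
    """
    return {
        name: any(ray_sphere_hit(camera, pos, c, r) for c, r in blockers)
        for name, pos in joints.items()
    }
```

In the paper's pipeline, labels like these (computed against the captured ground-truth poses) would serve as supervision targets, so the network learns to infer the parts the egocentric camera cannot see.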

