Article

CrossFuNet: RGB and Depth Cross-Fusion Network for Hand Pose Estimation

Journal

SENSORS
Volume 21, Issue 18

Publisher

MDPI
DOI: 10.3390/s21186095

Keywords

hand pose estimation; convolutional neural network; RGBD fusion

Funding

  1. Program of the Shanghai Normal University [309-C-9000-20-309119]


In this paper, a novel RGB and depth information fusion network called CrossFuNet is proposed to improve the accuracy of 3D hand pose estimation. The RGB image and its paired depth map are fed into two separate subnetworks, and their feature maps are combined in a fusion module that merges information from both modalities in a new way. The model is validated on two public datasets and outperforms state-of-the-art methods.
Despite recent successes in hand pose estimation from RGB images or depth maps, inherent challenges remain. RGB-based methods suffer from heavy self-occlusion and depth ambiguity, while depth sensors depend strongly on distance and can only be used indoors, which limits the practical application of depth-based methods. These challenges inspired us to combine the two modalities so that each offsets the shortcomings of the other. In this paper, we propose a novel RGB and depth information fusion network, called CrossFuNet, to improve the accuracy of 3D hand pose estimation. Specifically, the RGB image and the paired depth map are fed into two separate subnetworks. Their feature maps are fused in a fusion module, in which we propose a completely new approach to combining the information from the two modalities. The 3D keypoints are then regressed from heatmaps in the standard way. We validate our model on two public datasets, and the results show that our model outperforms state-of-the-art methods.
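The abstract mentions regressing keypoints from heatmaps. As an illustrative sketch only (not the authors' implementation, which is not reproduced here), the common soft-argmax decoding of a keypoint coordinate from a predicted heatmap can be written in NumPy as follows; the heatmap values and shapes are hypothetical:

```python
import numpy as np

def soft_argmax_2d(heatmaps):
    """Decode (x, y) keypoint coordinates from a stack of heatmaps.

    heatmaps: array of shape (K, H, W), one heatmap per keypoint.
    Returns an array of shape (K, 2) with sub-pixel (x, y) coordinates,
    computed as the probability-weighted average of pixel locations.
    """
    K, H, W = heatmaps.shape
    # Softmax over each heatmap so it behaves like a probability map.
    flat = heatmaps.reshape(K, -1)
    flat = flat - flat.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(flat) / np.exp(flat).sum(axis=1, keepdims=True)
    probs = probs.reshape(K, H, W)
    # Expectation of the pixel grid under each probability map.
    ys, xs = np.mgrid[0:H, 0:W]
    x = (probs * xs).sum(axis=(1, 2))
    y = (probs * ys).sum(axis=(1, 2))
    return np.stack([x, y], axis=1)

# Hypothetical example: one heatmap sharply peaked at (x=3, y=5).
hm = np.full((1, 8, 8), -10.0)
hm[0, 5, 3] = 10.0
coords = soft_argmax_2d(hm)  # close to [[3.0, 5.0]]
```

Compared with a hard argmax, the soft version is differentiable, which is why heatmap-based pose estimators typically train end-to-end through a decoding step of this kind.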

