Article

Quo Vadis, Skeleton Action Recognition?

Journal

INTERNATIONAL JOURNAL OF COMPUTER VISION
Volume 129, Issue 7, Pages 2097-2112

Publisher

SPRINGER
DOI: 10.1007/s11263-021-01470-y

Keywords

Human action recognition; Human activity recognition; Skeleton; 3-D human pose; Deep learning

Funding

  1. MeitY, Government of India


In this paper, we study current and upcoming frontiers across the landscape of skeleton-based human action recognition. To study skeleton-based action recognition in the wild, we introduce Skeletics-152, a curated, 3-D pose-annotated subset of RGB videos sourced from Kinetics-700, a large-scale action dataset. We extend our study to include out-of-context actions by introducing Skeleton-Mimetics, a dataset derived from the recently introduced Mimetics dataset. We also introduce Metaphorics, a dataset with caption-style annotated YouTube videos of the popular social game Dumb Charades and interpretative dance performances. We benchmark state-of-the-art models on the NTU-120 dataset and provide a multi-layered assessment of the results. The results from benchmarking the top performers of NTU-120 on the newly introduced datasets reveal the challenges and domain gap induced by actions in the wild. Overall, our work characterizes the strengths and limitations of existing approaches and datasets. Via the introduced datasets, our work enables new frontiers for human action recognition.


