Article

Effective and efficient human action recognition using dynamic frame skipping and trajectory rejection

Journal

IMAGE AND VISION COMPUTING
Volume 58, Pages 76-85

Publisher

ELSEVIER
DOI: 10.1016/j.imavis.2016.06.002

Keywords

Frame skipping; Human action recognition (HAR); Motion descriptor; Motion trajectory; Optical flow

Funding

  1. ICT R&D program of MSIP/IITP [Development of global multi-target tracking and event prediction techniques based on real-time large-scale video analysis]. [B010116-0525]

Abstract

Human action recognition (HAR) is a core technology for human-computer interaction and video understanding, attracting significant research and development attention in the field of computer vision. However, in uncontrolled environments, achieving effective HAR remains challenging due to the widely varying nature of video content. In previous research efforts, trajectory-based video representations have been widely used for HAR. Although these approaches show state-of-the-art HAR performance on various datasets, issues such as high computational complexity and the presence of redundant trajectories still need to be addressed in order to solve the problem of real-world HAR. In this paper, we propose a novel method for HAR, integrating a technique for rejecting redundant trajectories that mainly originate from camera movement, without degrading the effectiveness of HAR. Furthermore, in order to facilitate efficient optical flow estimation prior to trajectory extraction, we integrate a technique for dynamic frame skipping. As a result, we only make use of a small subset of the frames present in a video clip for optical flow estimation. Comparative experiments with five publicly available human action datasets show that the proposed method outperforms state-of-the-art HAR approaches in terms of effectiveness, while simultaneously mitigating the computational complexity. © 2016 Elsevier B.V. All rights reserved.
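As a rough illustration of the two ideas in the abstract, the following Python/OpenCV sketch skips low-motion frames before computing optical flow and rejects point tracks whose motion is explained by a RANSAC homography, a common proxy for global camera motion. This is a minimal sketch under assumptions, not the authors' implementation: the frame-difference skipping criterion, the thresholds SKIP_THRESHOLD and REPROJ_THRESHOLD, and all function names are hypothetical, and the paper's actual skipping policy and trajectory descriptors are not reproduced here.

```python
import cv2
import numpy as np

SKIP_THRESHOLD = 2.0    # hypothetical: mean frame difference below this -> skip frame
REPROJ_THRESHOLD = 3.0  # hypothetical: RANSAC reprojection threshold in pixels

def should_skip(prev_gray, gray):
    # Dynamic frame skipping (illustrative criterion): if the frame barely
    # changed, skip the expensive optical-flow step for it entirely.
    return float(np.mean(cv2.absdiff(prev_gray, gray))) < SKIP_THRESHOLD

def reject_camera_trajectories(p0, p1):
    # Fit a RANSAC homography modelling global (camera) motion between two
    # frames. Inlier correspondences move with the camera and are rejected
    # as redundant; outliers move independently (likely the acting person).
    H, mask = cv2.findHomography(p0, p1, cv2.RANSAC, REPROJ_THRESHOLD)
    if H is None or mask is None:
        return p0, p1
    keep = mask.ravel() == 0  # keep points NOT explained by camera motion
    return p0[keep], p1[keep]

def extract_foreground_matches(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    matches = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if should_skip(prev_gray, gray):
            continue  # skipped frame: no flow computed, prev_gray unchanged
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                      qualityLevel=0.01, minDistance=5)
        if pts is not None and len(pts) >= 4:
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good = status.ravel() == 1
            p0 = pts[good].reshape(-1, 2)
            p1 = nxt[good].reshape(-1, 2)
            if len(p0) >= 4:
                p0, p1 = reject_camera_trajectories(p0, p1)
            matches.append((p0, p1))  # surviving point tracks for descriptors
        prev_gray = gray
    cap.release()
    return matches
```

Note that skipped frames leave prev_gray untouched, so flow is estimated between non-adjacent frames; this mirrors the abstract's claim that only a small subset of frames is used for optical flow estimation.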
