Article

Exploring trace transform for robust human action recognition

Journal

PATTERN RECOGNITION
Volume 46, Issue 12, Pages 3238-3248

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2013.06.006

Keywords

Human action recognition; Motion analysis; Action classification; Trace transform

Funding

  1. ICT-Project Siren [FP7-ICT-258453]
  2. European Union (European Social Fund ESF)
  3. Greek national funds through the Operational Program Education and Lifelong Learning of the National Strategic Reference Framework (NSRF)

Abstract

Machine-based human action recognition has become very popular in the last decade. Automatic unattended surveillance systems, interactive video games, machine learning and robotics are only a few of the areas that involve human action recognition. This paper examines the capability of a known transform, the Trace transform, for human action recognition and proposes two new feature extraction methods based on it. The first method extracts Trace transforms from binarized silhouettes representing different stages of a single action period; a final history template composed from these transforms represents the whole sequence and retains much of the valuable spatiotemporal information contained in a human action. The second uses the Trace transform to construct a set of invariant features that represent the action sequence and can cope with variations that commonly appear in video capture. This method exploits the inherent properties of the Trace transform to produce noise-robust features that are invariant to translation, rotation and scaling, and that are effective, simple and fast to compute. Classification experiments performed on two well-known and challenging action datasets (KTH and Weizmann) using a Radial Basis Function (RBF) kernel SVM provided very competitive results, indicating the potential of the proposed techniques. (C) 2013 Elsevier Ltd. All rights reserved.
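
For concreteness, the following Python sketch illustrates the kind of computation involved: a discretized Trace transform of a binarized silhouette image, followed by one scalar "triple" feature obtained by collapsing the transform with further functionals. The function names, the particular functionals (np.sum, np.max, np.median) and the sampling resolutions are illustrative assumptions, not the configuration used in the paper.

import numpy as np
from scipy.ndimage import map_coordinates

def trace_transform(img, n_angles=180, n_dists=128, T=np.sum):
    # Sample the image along every line (phi, p): phi is the angle of the
    # line's normal, p its signed distance from the image centre, and T a
    # trace functional applied to the intensity samples along the line.
    # With T = np.sum this is essentially a discrete Radon transform.
    h, w = img.shape
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    r = 0.5 * np.hypot(h, w)                    # radius covering the image
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    dists = np.linspace(-r, r, n_dists)
    t = np.linspace(-r, r, 2 * n_dists)         # positions along each line
    out = np.zeros((n_angles, n_dists))
    for i, phi in enumerate(angles):
        nx, ny = np.cos(phi), np.sin(phi)       # unit normal of the line
        for j, p in enumerate(dists):
            xs = cx + p * nx - t * ny           # line through p*(nx, ny),
            ys = cy + p * ny + t * nx           # directed along (-ny, nx)
            vals = map_coordinates(img.astype(float), [ys, xs],
                                   order=1, cval=0.0)
            out[i, j] = T(vals)
    return out

def triple_feature(trace, P=np.max, Phi=np.median):
    # Collapse the distance axis with a "diametric" functional P, leaving a
    # 1-D function of the angle, then collapse that with a "circus"
    # functional Phi to obtain a single scalar feature.
    return Phi(np.apply_along_axis(P, 1, trace))

Feature vectors assembled from several such triple features (one per combination of functionals) could then be fed to an RBF-kernel SVM, e.g. scikit-learn's SVC(kernel='rbf'), mirroring the classification setup described above.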
