Article

Human Action Recognition by Learning Spatio-Temporal Features With Deep Neural Networks

Journal

IEEE ACCESS
Volume 6, Pages 17913-17922

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2018.2817253

Keywords

Artificial intelligence; human action recognition; attention model; deep neural networks; robotic system

Funding

  1. National Natural Science Foundation of China [U1713213, 61772508, 61762014, 61775175, 61673192]
  2. Guangdong Natural Science Funds [2014A030310252]
  3. CAS Key Technology Talent Program
  4. Shenzhen Technology Project [JCYJ20170413152535587, JSGG20160331185256983, JSGG20160229115709109, JCYJ 20170307164023599]
  5. Guangdong Technology Program [2016B010108010, 2016B010125003]
  6. Key Laboratory of Human-Machine Intelligence-Synergy Systems, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences [2014DP173025]
  7. Shenzhen Engineering Laboratory for 3D Content Generating Technologies [[2017] 476]

Abstract

Human action recognition is one of the fundamental challenges in robotic systems. In this paper, we propose a lightweight action recognition architecture based on deep neural networks that uses only RGB data. The proposed architecture consists of a convolutional neural network (CNN), long short-term memory (LSTM) units, and a temporal-wise attention model. First, the CNN extracts spatial features that distinguish objects from the background, capturing both local and semantic characteristics. Second, two kinds of LSTM networks are applied to the spatial feature maps of different CNN layers (a pooling layer and a fully-connected layer) to extract temporal motion features. Then, a temporal-wise attention model is placed after the LSTMs to learn which parts of which frames are more important. Lastly, a joint optimization module is designed to explore the intrinsic relations between the two kinds of LSTM features. Experimental results demonstrate the effectiveness of the proposed method.
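The temporal-wise attention step described in the abstract, which learns how much each frame contributes to the final prediction, can be sketched as a softmax-weighted pooling over per-frame LSTM outputs. This is a minimal NumPy illustration, not the paper's exact formulation: the function names and the single scoring vector `w` (standing in for the learned attention parameters) are assumptions for the sketch.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def temporal_attention(h, w):
    """Pool per-frame features by learned temporal importance.

    h : (T, D) array of per-frame LSTM outputs (T frames, D-dim features)
    w : (D,) scoring vector (illustrative stand-in for learned attention params)
    Returns the attention weights (T,) and the pooled feature (D,).
    """
    scores = h @ w            # one relevance score per frame
    alpha = softmax(scores)   # normalize so frame weights sum to 1
    pooled = alpha @ h        # attention-weighted average over time
    return alpha, pooled

# toy example: 5 frames of 8-dim features
rng = np.random.default_rng(0)
h = rng.standard_normal((5, 8))
w = rng.standard_normal(8)
alpha, pooled = temporal_attention(h, w)
```

Frames with higher scores dominate the pooled feature, which is what lets the model emphasize "which frames are more important" before classification.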
