Article

Recurrent Semantic Preserving Generation for Action Prediction

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TCSVT.2020.2975065

Keywords

Action prediction; deep reinforcement learning; generative adversarial network; skeleton-based action

Funding

  1. National Key Research and Development Program of China [2017YFA0700802]
  2. National Natural Science Foundation of China [61822603, U1813218, U1713214, 61672306, 91746107]

Abstract

In this paper, we propose a recurrent semantic preserving generation (RSPG) method for action prediction. Unlike most existing methods, which do not fully exploit the information in partially observed sequences, we develop a generation architecture that completes the skeleton sequence before predicting the action, exploiting more of the potential information about the movement tendency. Our method learns to capture the tendency of the observed sequence and to complete the subsequent action through adversarial learning under constraints that preserve consistency between the generated sequence and the observed one. By generating the subsequent action, our method can predict the action with the highest probability. However, redundant generation introduces noise and disturbs the prediction, while insufficient generation fails to exploit the potential information that could improve it. Our RSPG therefore controls the number of generation steps in a recurrent manner to maximize the discriminative information of actions, which allows it to adapt to the variable lengths of different actions. We evaluate our method on four popular action datasets: NTU, UCF101, BIT, and UT-Interaction; experimental results show that our method achieves very competitive performance compared with the state of the art.
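The abstract's idea of recurrently controlling the number of generation steps, stopping before redundant frames add noise but after enough frames have been generated to be informative, can be sketched as follows. This is a toy illustration only: the function names, the linear-extrapolation "generator", and the confidence stand-in are all assumptions of this sketch, not the paper's adversarially trained RSPG model.

```python
def extrapolate_frame(seq):
    """Toy stand-in for the generator: linearly extrapolate the next
    frame from the last two frames (each frame is a list of joint coords)."""
    last, prev = seq[-1], seq[-2]
    return [2 * a - b for a, b in zip(last, prev)]

def confidence(seq):
    """Stand-in for a classifier's confidence on the completed sequence.
    Here it simply grows with length, with diminishing returns."""
    return 1.0 - 1.0 / len(seq)

def complete_sequence(observed, max_steps=10, min_gain=0.01):
    """Recurrently append generated frames, stopping once the confidence
    gain falls below min_gain (to avoid redundant generation) or after
    max_steps (to bound the completed length)."""
    seq = list(observed)
    score = confidence(seq)
    for _ in range(max_steps):
        seq.append(extrapolate_frame(seq))
        new_score = confidence(seq)
        if new_score - score < min_gain:
            seq.pop()  # redundant step: discard the frame and stop
            break
        score = new_score
    return seq
```

In the actual method, the stopping decision would come from the discriminative signal of the learned model rather than a fixed heuristic; the sketch only shows the control flow that adapts the generated length per action.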
