4.8 Article

Discriminative Video Pattern Search for Efficient Action Detection

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2011.38

Keywords

Video pattern search; action detection; spatiotemporal branch-and-bound search

Funding

  1. Nanyang Assistant Professorship
  2. US National Science Foundation [IIS-0347877, IIS-0916607]
  3. US Army Research Laboratory
  4. US Army Research Office [ARO W911NF-08-1-0504]
  5. Division of Information & Intelligent Systems
  6. Directorate for Computer & Information Science & Engineering [0916607], National Science Foundation


Actions are spatiotemporal patterns. Similar to sliding window-based object detection, action detection finds the reoccurrences of such spatiotemporal patterns through pattern matching, while handling cluttered and dynamic backgrounds and other types of action variation. We address two critical issues in pattern matching-based action detection: 1) the intrapattern variations in actions, and 2) the computational efficiency of action pattern search in cluttered scenes. First, we propose a discriminative pattern matching criterion for action classification, called naive Bayes mutual information maximization (NBMIM). Each action is characterized by a collection of spatiotemporal invariant features, and it is matched with an action class by measuring the mutual information between them. Under this matching criterion, action detection amounts to localizing the subvolume of the volumetric video space that has the maximum mutual information with respect to a specific action class. A novel spatiotemporal branch-and-bound (STBB) search algorithm is designed to find the optimal solution efficiently. The proposed action detection method does not rely on human detection, tracking, or background subtraction. It handles action variations such as changes in performing speed, style, and scale well, and it is insensitive to dynamic and cluttered backgrounds and even to partial occlusions. Cross-data set experiments on action detection, covering the KTH and CMU action data sets as well as a new MSR action data set, demonstrate the effectiveness and efficiency of the proposed multiclass, multiple-instance action detection method.
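For intuition, the following is a minimal Python sketch of the kind of computation the abstract describes: each spatiotemporal feature casts a discriminative vote (a log-likelihood ratio standing in for the per-feature mutual information contribution of NBMIM), and detection searches for the video subvolume with the maximum total vote. The function names (vote_scores, best_temporal_segment, best_subvolume), the dense (H, W, T) vote map, and the brute-force spatial scan are illustrative assumptions, not the paper's implementation; the STBB algorithm avoids enumerating every spatial window by pruning with branch-and-bound upper bounds.

```python
# Simplified sketch of discriminative-vote action detection (not the paper's STBB search).
import numpy as np

def vote_scores(log_lik_pos, log_lik_neg):
    """Per-feature vote: log-likelihood ratio of the action class vs. the background class.
    The densities themselves are assumed to be given (e.g., from kernel density estimates)."""
    return log_lik_pos - log_lik_neg

def best_temporal_segment(series):
    """Kadane's max-subarray: optimal temporal extent and score for a fixed spatial window."""
    best, best_rng = -np.inf, (0, 0)
    cur, cur_start = 0.0, 0
    for t, v in enumerate(series):
        if cur <= 0:               # restart the segment when the running sum goes non-positive
            cur, cur_start = float(v), t
        else:
            cur += v
        if cur > best:
            best, best_rng = cur, (cur_start, t)
    return best, best_rng

def best_subvolume(vote_map):
    """Exhaustively scan spatial rectangles of an (H, W, T) vote map; for each rectangle
    the optimal temporal extent is found in O(T). A branch-and-bound scheme such as STBB
    replaces this O(H^2 W^2) spatial enumeration with a pruned search over sets of windows."""
    H, W, T = vote_map.shape
    best_score, best_box = -np.inf, None
    for y0 in range(H):
        for y1 in range(y0, H):
            for x0 in range(W):
                for x1 in range(x0, W):
                    series = vote_map[y0:y1 + 1, x0:x1 + 1, :].sum(axis=(0, 1))
                    s, (t0, t1) = best_temporal_segment(series)
                    if s > best_score:
                        best_score, best_box = s, (y0, y1, x0, x1, t0, t1)
    return best_score, best_box

if __name__ == "__main__":
    # Toy check: mostly negative votes with a planted positive block that the
    # search should roughly recover.
    rng = np.random.default_rng(0)
    votes = rng.normal(-0.2, 0.5, size=(10, 12, 30))
    votes[3:7, 4:9, 10:20] += 1.0
    print(best_subvolume(votes))
```

The Kadane step shows why handling the temporal extent separately is cheap; making the remaining spatial search efficient is the part a spatiotemporal branch-and-bound scheme addresses.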
