Article

Accurate 3D action recognition using learning on the Grassmann manifold

Journal

Pattern Recognition
Volume 48, Issue 2, Pages 556-567

Publisher

Elsevier
DOI: 10.1016/j.patcog.2014.08.011

Keywords

Human action recognition; Grassmann manifold; Observational latency; Depth images; Skeleton; Classification

Funding

  1. National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Information & Intelligent Systems (Grant 1217515)
  2. National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations (Grant 1319658)

Abstract

In this paper, we address the problem of modeling and analyzing human motion from 3D body skeletons. In particular, we aim to represent skeletal motion in a geometric and efficient way, leading to an accurate action-recognition system. An action is represented by a dynamical system whose observability matrix is characterized as an element of a Grassmann manifold. To formulate our learning algorithm, we propose two distinct ideas: (1) performing classification with a Truncated Wrapped Gaussian model, one for each class in its own tangent space; and (2) a novel learning algorithm that concatenates the local coordinates in the tangent spaces associated with the different classes into a single vector representation and trains a linear SVM on it. We evaluate our approaches on three public 3D action datasets, MSR-Action3D, UT-Kinect, and UCF-Kinect, which pose different kinds of challenges and together provide an exhaustive evaluation. The results show that our approaches match or exceed state-of-the-art performance, reaching 91.21% on MSR-Action3D, 97.91% on UCF-Kinect, and 88.5% on UT-Kinect. Finally, we evaluate the latency of our approach, i.e. its ability to recognize an action before it ends, and demonstrate improvements over other published approaches. © 2014 Elsevier Ltd. All rights reserved.
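The second approach lends itself to a compact sketch. The following is a minimal illustration, not the authors' implementation: it assumes Python with NumPy and scikit-learn (which the paper does not prescribe), substitutes a thin SVD of the centered feature sequence for the observability matrix of a fitted linear dynamical system, and uses a single common base point where the paper concatenates coordinates across several class-specific tangent spaces. The subspace dimension p, the feature sizes, and the toy data are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC

def grassmann_point(sequence, p=5):
    """Map a (T, d) feature sequence to a (d, p) orthonormal basis of its
    dominant p-dimensional subspace: a point on the Grassmann manifold.
    A thin SVD of the centered data stands in for the observability matrix
    of a fitted linear dynamical system (an assumption of this sketch)."""
    centered = sequence - sequence.mean(axis=0, keepdims=True)
    U, _, _ = np.linalg.svd(centered.T, full_matrices=False)
    return U[:, :p]

def grassmann_log(X, Y):
    """Tangent vector at base point X pointing toward Y (both (d, p) with
    orthonormal columns), via the arctangent form of the Grassmann log map."""
    XtY = X.T @ Y
    M = (Y - X @ XtY) @ np.linalg.inv(XtY)  # part of Y orthogonal to span(X)
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.arctan(S)) @ Vt

# Toy data: class-1 sequences pass through a fixed mixing matrix, so the two
# classes occupy different dominant subspaces (no real skeleton data here).
rng = np.random.default_rng(0)
mix = rng.standard_normal((20, 20))
seqs = ([rng.standard_normal((60, 20)) for _ in range(20)]
        + [rng.standard_normal((60, 20)) @ mix for _ in range(20)])
labels = np.array([0] * 20 + [1] * 20)

points = [grassmann_point(s) for s in seqs]
base = grassmann_point(np.vstack(seqs))          # crude shared base point
feats = np.stack([grassmann_log(base, Y).ravel() for Y in points])

clf = LinearSVC(dual=False).fit(feats, labels)   # linear SVM on coordinates
print("training accuracy:", clf.score(feats, labels))
```

Flattening the output of grassmann_log gives the tangent-space local coordinates that the linear SVM consumes; the paper's version builds one such coordinate block per class-specific tangent space and concatenates them before training.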

Authors

Rim Slama, Hazem Wannous, Mohamed Daoudi, Anuj Srivastava
