Article

Towards efficient and objective work sampling: Recognizing workers' activities in site surveillance videos with two-stream convolutional networks

Journal

AUTOMATION IN CONSTRUCTION
Volume 94, Pages 360-370

Publisher

ELSEVIER
DOI: 10.1016/j.autcon.2018.07.011

Keywords

Labor productivity evaluation; Work sampling; Two-stream convolutional networks; Activity recognition

Funding

  1. Innovation and Technology Commission (ITC) of Hong Kong [ITP/036/12LP]
  2. Research Grants Council of Hong Kong [PolyU 152093/14E]


Capturing the working states of on-foot workers allows managers to precisely quantify and benchmark labor productivity, which in turn enables them to evaluate productivity losses and identify their causes. Work sampling is a widely used method for this task, but it suffers from low efficiency because only one worker is selected for each observation. Attentional selection asymmetry can also bias its assumption of uniform object selection. Existing vision-based methods are primarily oriented towards recognizing single, separated activities involving few workers or equipment. In this paper, we introduce an activity recognition method that receives surveillance videos as input and produces diverse and continuous activity labels for individual workers in the field of view. Convolutional networks are used to recognize activities, which are encoded in spatial and temporal streams. A new fusion strategy is developed to combine the recognition results of the two streams. The experimental results show that our activity recognition method achieves an average accuracy of 80.5%, which is comparable with the state of the art in activity recognition in the computer vision community, given the severe camera motion and low resolution of site surveillance videos and the marginal inter-class difference and significant intra-class variation of workers' activities. We also demonstrate that our method can underpin the implementation of efficient and objective work sampling. The training and test datasets of the study are publicly available.
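To make the two-stream idea concrete, the sketch below shows a common late-fusion scheme: the spatial (appearance) and temporal (motion) streams each produce per-class scores, which are converted to probabilities and combined by a weighted average. This is a minimal illustration of generic two-stream score fusion, not the paper's specific fusion strategy; the function names and weights are assumptions for illustration only.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def fuse_two_stream(spatial_logits, temporal_logits,
                    w_spatial=0.4, w_temporal=0.6):
    """Weighted late fusion of the two streams' class scores.

    The weights are illustrative; in practice they would be tuned
    on a validation set (the motion stream is often weighted higher
    for activity recognition).
    """
    p_spatial = softmax(spatial_logits)
    p_temporal = softmax(temporal_logits)
    fused = w_spatial * p_spatial + w_temporal * p_temporal
    return fused.argmax(axis=-1), fused

# Toy example: scores for one worker clip over 3 activity classes.
spatial = np.array([2.0, 0.5, 0.1])   # appearance stream favors class 0
temporal = np.array([0.2, 2.5, 0.3])  # motion stream favors class 1
label, scores = fuse_two_stream(spatial, temporal)
# With the higher temporal weight, the motion stream's vote prevails.
```

Because the fused scores remain a probability distribution over activity classes, the same routine can be applied per worker and per time window to yield the continuous activity labels that work sampling requires.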
