Journal: Pattern Recognition
Volume 44, Issue 6, Pages 1296-1311
Publisher: Elsevier Sci Ltd
DOI: 10.1016/j.patcog.2010.11.022
Keywords
Background subtraction; Object detection; Statistical reach feature (SRF); Grayscale arranging pairs (GAP)
In this paper, we propose a robust and accurate background model, called grayscale arranging pairs (GAP). The model is based on the statistical reach feature (SRF), which is defined as a set of statistical pair-wise features. Using the GAP model, moving objects are successfully detected under a variety of complex environmental conditions. The main concept of the proposed method is the use of multiple point pairs that exhibit a stable statistical intensity relationship as a background model. The intensity difference between the pixels of each pair is much more stable than the intensity of a single pixel, especially in varying environments. Our proposed method focuses more on the history of global spatial correlations between pixels than on the history of any given pixel or on local spatial correlations. Furthermore, we clarify how to reduce the GAP modeling time and present experimental results comparing GAP with existing object detection methods, demonstrating that GAP achieves superior object detection with higher precision and recall rates. (C) 2010 Elsevier Ltd. All rights reserved.
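The core idea described in the abstract, using pixel pairs whose intensity ordering stays stable over time as the background model, and flagging a pixel as foreground when those pair relationships break, can be sketched as below. This is a simplified illustration under our own assumptions, not the paper's actual SRF/GAP algorithm; the function names, candidate selection, and the `stability_thresh`/`violation_ratio` parameters are all hypothetical.

```python
def build_pair_model(frames, target, candidates, stability_thresh=0.95):
    """Illustrative sketch: for a target pixel, keep candidate partner pixels
    whose brighter/darker ordering relative to the target holds in almost
    every historical frame (a simplified stand-in for the paper's
    statistical reach feature)."""
    stable_pairs = []
    for c in candidates:
        # Fraction of frames in which the candidate is brighter than the target.
        brighter = sum(f[c] > f[target] for f in frames) / len(frames)
        # Keep the pair only if its ordering is statistically stable.
        if brighter >= stability_thresh or brighter <= 1 - stability_thresh:
            stable_pairs.append((c, brighter >= stability_thresh))
    return stable_pairs

def is_foreground(frame, target, stable_pairs, violation_ratio=0.5):
    """Flag the target pixel as foreground when too many of its stable
    pair relationships are violated in the current frame."""
    if not stable_pairs:
        return False
    violations = sum(
        (frame[c] > frame[target]) != expect_brighter
        for c, expect_brighter in stable_pairs
    )
    return violations / len(stable_pairs) > violation_ratio

# Toy 1-D "frames": index 0 is the target pixel, indices 1-3 are candidates.
history = [[10, 50, 5, 60], [12, 55, 6, 58], [11, 52, 4, 61]]
pairs = build_pair_model(history, target=0, candidates=[1, 2, 3])
# A background-like frame preserves the orderings; a bright object breaks them.
print(is_foreground([11, 53, 5, 59], 0, pairs))   # False
print(is_foreground([200, 53, 5, 59], 0, pairs))  # True
```

Note how the decision depends only on the *relative ordering* within each pair, which is why, as the abstract argues, such pair-wise features are more robust to global illumination changes than the absolute intensity of a single pixel.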