4.5 Article

Person re-identification in crowd

Journal

PATTERN RECOGNITION LETTERS
Volume 33, Issue 14, Pages 1828-1837

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2012.02.014

Keywords

Person re-identification; Non-overlapping cameras; People in crowd; Trajectory propagation; Appearance features; London Gatwick airport dataset

Funding

  1. Erasmus Mundus Joint Doctorate in Interactive and Cognitive Environments
  2. Education, Audiovisual & Culture Executive Agency (FPA) [2010-0012]

Abstract

Person re-identification aims to recognize the same person viewed by disjoint cameras at different time instants and locations. In this paper, after an extensive review of state-of-the-art approaches, we propose a re-identification method that takes into account the appearance of people, the spatial location of cameras and the potential paths a person can choose to follow. This choice is modeled with a set of areas of interest (landmarks) that constrain the propagation of people's trajectories in the non-observed regions between the fields of view of the cameras. We represent people with a selective patch around their upper body in order to operate in crowded scenes where occlusions are frequent. We demonstrate the proposed method in a challenging scenario from London Gatwick airport and compare it to well-known person re-identification methods, highlighting their strengths and limitations. Finally, we show by means of the Cumulative Matching Characteristic curve that the best performance is obtained by modeling people's movements in non-observed regions combined with appearance-based methods, achieving an average improvement of 6% over using appearance alone and 15% over using motion alone for the association of people across cameras. (c) 2012 Elsevier B.V. All rights reserved.
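The abstract does not specify how the appearance and motion cues are combined, nor how the CMC curve is computed. As a rough, hypothetical illustration (the weighted-sum fusion, the 0.5/0.5 weights, and all function names below are assumptions, not taken from the paper), here is a minimal sketch of fusing two similarity matrices and evaluating the ranking with a Cumulative Matching Characteristic curve:

```python
import numpy as np

def cmc_curve(similarity, gallery_ids, probe_ids):
    """Cumulative Matching Characteristic: fraction of probes whose true
    match appears within the top-k ranked gallery candidates."""
    n_probes, n_gallery = similarity.shape
    ranks = np.zeros(n_gallery)
    for p in range(n_probes):
        order = np.argsort(-similarity[p])                    # best match first
        rank = np.where(gallery_ids[order] == probe_ids[p])[0][0]
        ranks[rank:] += 1                                      # counted from the rank of the true match onward
    return ranks / n_probes

def fuse_scores(appearance_sim, motion_sim, w_app=0.5, w_mot=0.5):
    # Hypothetical linear fusion of appearance and motion similarities;
    # the weights are illustrative only.
    return w_app * appearance_sim + w_mot * motion_sim

# Toy example: 3 probes matched against a gallery of 4 identities.
rng = np.random.default_rng(0)
appearance = rng.random((3, 4))
motion = rng.random((3, 4))
probe_ids = np.array([0, 1, 2])
gallery_ids = np.array([0, 1, 2, 3])
print(cmc_curve(fuse_scores(appearance, motion), gallery_ids, probe_ids))
```

In this sketch, rank-1 of the CMC curve is the fraction of probes whose highest-scoring gallery candidate is the correct identity; comparing the curve of the fused scores against the appearance-only and motion-only curves is how the kind of improvement quoted in the abstract would typically be measured.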

