Article

Multi-type attributes driven multi-camera person re-identification

Journal

PATTERN RECOGNITION
Volume 75, Pages 77-89

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2017.07.005

Keywords

Deep attributes; Person re-identification

Funding

  1. National Science Foundation of China [61572050, 61672519, 91538111, 61620106009, 61429201]
  2. National 1000 Youth Talents Plan
  3. ARO [W911NF-15-1-0290]
  4. NEC Laboratory of America
  5. NEC Laboratory of Blippar

Abstract

One of the major challenges in person Re-Identification (ReID) is the inconsistent visual appearance of a person. Current work on visual feature and distance metric learning has made significant progress, but still suffers from limited robustness to pose variations, viewpoint changes, etc., and from high computational complexity. This makes person ReID across multiple cameras still challenging. This work is motivated to learn mid-level human attributes that are robust to visual appearance variations and can be used as efficient features for person matching. We propose a weakly supervised multi-type attribute learning framework that considers the contextual cues among attributes and progressively boosts the accuracy of attributes using only a limited number of labeled samples. Specifically, this framework involves three training stages. A deep Convolutional Neural Network (dCNN) is first trained on an independent dataset labeled with attributes. It is then fine-tuned on another dataset labeled only with person IDs using our defined triplet loss. Finally, the updated dCNN predicts attribute labels for the target dataset, which is combined with the independent dataset for the final round of fine-tuning. The predicted attributes, namely deep attributes, exhibit promising generalization ability across different datasets. By directly using the deep attributes with a simple cosine distance, we obtain competitive accuracy on four person ReID datasets. Experiments also show that a simple distance metric learning module further boosts our method, making it outperform many recent works. (C) 2017 Elsevier Ltd. All rights reserved.
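
For orientation, the matching step described in the abstract (ranking gallery persons by cosine distance over predicted attribute vectors) can be sketched roughly as follows. This is a hypothetical illustration only: the ResNet-50 backbone, the AttributeNet and rank_gallery names, and the attribute count are placeholders, not the authors' architecture, their attribute set, or their defined triplet loss.

# Hypothetical sketch of deep-attribute matching with cosine distance.
# The backbone and attribute count are illustrative stand-ins, not the
# architecture used in the paper.
import torch
import torch.nn as nn
from torchvision import models


class AttributeNet(nn.Module):
    """Backbone with a sigmoid head that outputs a vector of attribute scores."""

    def __init__(self, num_attributes: int = 64):  # attribute count is a placeholder
        super().__init__()
        backbone = models.resnet50(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, num_attributes)
        self.backbone = backbone

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        # Attribute probabilities in [0, 1] serve as the "deep attributes".
        return torch.sigmoid(self.backbone(images))


@torch.no_grad()
def rank_gallery(model: nn.Module,
                 query: torch.Tensor,
                 gallery: torch.Tensor) -> torch.Tensor:
    """Return gallery indices sorted by cosine similarity to the query."""
    model.eval()
    q = model(query.unsqueeze(0))   # (1, A) attribute vector for the query crop
    g = model(gallery)              # (N, A) attribute vectors for gallery crops
    sims = torch.nn.functional.cosine_similarity(q, g)  # (N,)
    return sims.argsort(descending=True)


if __name__ == "__main__":
    net = AttributeNet()
    query_img = torch.randn(3, 256, 128)         # one query crop (C, H, W)
    gallery_imgs = torch.randn(10, 3, 256, 128)  # ten gallery crops
    print(rank_gallery(net, query_img, gallery_imgs))

In this reading, person matching reduces to a nearest-neighbor search in attribute space; the paper's optional distance metric learning module would replace the plain cosine similarity above.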
