Article

Deep attention-aware feature learning for person re-identification

Journal

PATTERN RECOGNITION
Volume 126, Article 108567

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.108567

Keywords

Person re-identification; Attention learning; Multi-task learning

Funding

  1. National Natural Science Foundation of China [61876180, U2013202, 62076026, 61973029]
  2. Beijing Natural Science Foundation [4202073]
  3. Fundamental Research Funds for the Central Universities [FRF-TP-20-08B]
  4. Guangdong Basic and Applied Basic Research Foundation [2020B1515120050]

This paper proposes predicting attention maps as auxiliary objectives in person ReID networks, using a global attention map to focus on the person and local attention maps to focus on relevant body parts. The method improves person re-identification performance without adding inference cost.
Abstract

Visual attention has proven effective in improving the performance of person re-identification. Most existing methods apply visual attention heuristically by learning an additional attention map to re-weight the feature maps; however, this inevitably increases model complexity and inference time. In this paper, we propose to incorporate the prediction of attention maps as additional objectives in a person ReID network without changing its original structure, thereby keeping the inference time and model size unchanged. Two kinds of attention maps are considered to make the learned feature maps aware of the person and of related body parts, respectively. Globally, a holistic attention branch (HAB) makes the feature maps produced by the backbone focus on the person, alleviating the influence of the background. Locally, a partial attention branch (PAB) decouples the extracted features into several groups that are separately responsible for different body parts, increasing robustness to pose variation and partial occlusion. These two kinds of attention are generic and can be incorporated into existing ReID networks. We have tested them on two typical networks (TriNet [1] and Bag of Tricks [2]) and observed significant performance improvement on five widely used datasets. (c) 2022 Elsevier Ltd. All rights reserved.
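The abstract describes the core idea at the architecture level: keep the ReID backbone unchanged and supervise it with auxiliary attention-prediction objectives that are used only during training. The sketch below illustrates that multi-task setup in PyTorch under assumptions not stated in the abstract: the module and loss names (AttentionAwareReID, holistic_head, partial_head, training_loss), the ResNet-50 backbone, the 1x1-conv head design, the loss weights, and the availability of ground-truth attention maps (e.g., from a segmentation or pose model) are hypothetical choices for illustration, not the authors' implementation of HAB and PAB.

# Minimal sketch (not the authors' code) of attention prediction as auxiliary
# training objectives: the ReID backbone keeps its original structure, while two
# lightweight heads -- a holistic one (person vs. background) and a partial one
# (K body-part groups) -- are supervised only at training time.
# All names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


class AttentionAwareReID(nn.Module):
    def __init__(self, num_ids=751, num_parts=4):
        super().__init__()
        backbone = resnet50(weights=None)
        # Keep the convolutional trunk only; output feature maps have 2048 channels.
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(2048, num_ids)
        # Auxiliary heads: 1x1 convs predicting attention maps from the same features.
        self.holistic_head = nn.Conv2d(2048, 1, kernel_size=1)         # person-vs-background map
        self.partial_head = nn.Conv2d(2048, num_parts, kernel_size=1)  # one map per body-part group

    def forward(self, x):
        fmap = self.features(x)                 # (B, 2048, H, W)
        embedding = self.pool(fmap).flatten(1)  # (B, 2048) ReID descriptor, as in the base network
        logits = self.classifier(embedding)
        if self.training:
            holistic = self.holistic_head(fmap)  # (B, 1, H, W)
            partial = self.partial_head(fmap)    # (B, K, H, W)
            return logits, embedding, holistic, partial
        return logits, embedding                 # test-time path: backbone features only


def training_loss(logits, labels, holistic, holistic_gt, partial, partial_gt,
                  w_hab=0.5, w_pab=0.5):
    # ID loss plus auxiliary attention losses; the weights are illustrative only.
    id_loss = F.cross_entropy(logits, labels)
    hab_loss = F.binary_cross_entropy_with_logits(holistic, holistic_gt)
    pab_loss = F.binary_cross_entropy_with_logits(partial, partial_gt)
    return id_loss + w_hab * hab_loss + w_pab * pab_loss

Because both heads read the same backbone feature maps and are evaluated only inside the training branch, the test-time computation matches the plain backbone-plus-classifier model; the auxiliary heads can also be discarded entirely after training, which is one way to realize the unchanged inference time and model size the abstract emphasizes.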

