Article

Dual-branch adaptive attention transformer for occluded person re-identification

Journal

IMAGE AND VISION COMPUTING
Volume 131, Article 104633

Publisher

ELSEVIER
DOI: 10.1016/j.imavis.2023.104633

Keywords

Person re-identification; Multi-headed self-attention; Transformer; Metric learning

Abstract

Occluded person re-identification remains a common and challenging task because, in the real world, people are often occluded by obstacles such as cars and trees. To locate the unoccluded parts and extract local fine-grained features of the occluded human body, state-of-the-art (SOTA) methods usually rely on a pose estimation model, which introduces additional bias, and the resulting two-stage architecture further complicates the model. To solve this problem, an end-to-end dual-branch Transformer network for occluded person re-identification is designed. Specifically, one branch is a Transformer-based global branch responsible for extracting global features, while in the other, local branch we design the Selective Token Attention (STA) module. STA uses the multi-headed self-attention mechanism to select discriminative tokens and thereby extract local features effectively. Further, to alleviate the inconsistency between the convergence goals of Softmax Loss and Triplet Loss, Circle Loss is introduced to design the Goal Consistency Loss (GC Loss) that supervises the network. Experiments on four challenging Re-ID datasets (covering both occluded and holistic person Re-ID) show that our method achieves SOTA performance. (c) 2023 Elsevier B.V. All rights reserved.
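
To make the STA idea concrete, below is a minimal PyTorch sketch of what a selective-token-attention style module could look like. It assumes ViT-style input with a class token first, and a top-k selection rule driven by the class token's attention scores; the class and parameter names, shapes, and the top-k rule are illustrative assumptions, not the paper's exact design.

# Sketch of an STA-style module: multi-headed self-attention over patch
# tokens, then selection of the most-attended tokens as local features.
# All names and the top-k selection rule are assumptions for illustration.
import torch
import torch.nn as nn

class SelectiveTokenAttention(nn.Module):
    def __init__(self, dim: int = 768, num_heads: int = 12, top_k: int = 32):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.top_k = top_k

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, 1 + N, D), class token first, as in ViT.
        attn_out, attn_weights = self.attn(
            tokens, tokens, tokens,
            need_weights=True, average_attn_weights=True)
        # Attention paid by the class token to each patch token: (B, N).
        cls_to_patch = attn_weights[:, 0, 1:]
        # Keep the top-k most-attended patch tokens (assumed rule), i.e.
        # the tokens most likely to cover unoccluded body parts.
        idx = cls_to_patch.topk(self.top_k, dim=-1).indices      # (B, k)
        patches = attn_out[:, 1:, :]                             # (B, N, D)
        selected = torch.gather(
            patches, 1,
            idx.unsqueeze(-1).expand(-1, -1, patches.size(-1)))  # (B, k, D)
        # Average-pool the selected tokens into one local feature vector.
        return selected.mean(dim=1)                              # (B, D)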
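
GC Loss itself is not specified in the abstract, but it builds on Circle Loss (Sun et al., CVPR 2020), which can be sketched as follows on pairwise cosine similarities within a batch. The margin m and scale gamma follow the defaults from the Circle Loss paper; how the authors combine this with the Softmax (ID) objective to form GC Loss is not stated here, so only plain Circle Loss is shown.

# Sketch of Circle Loss on in-batch similarity pairs. Assumes the batch is
# sampled so that each identity has at least two images (e.g. PK sampling),
# so positive and negative pairs both exist.
import torch
import torch.nn.functional as F

def circle_loss(feats: torch.Tensor, labels: torch.Tensor,
                m: float = 0.25, gamma: float = 64.0) -> torch.Tensor:
    # feats: (B, D) embeddings; labels: (B,) identity labels.
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t()                         # (B, B) cosine similarity
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=feats.device)
    pos_mask, neg_mask = same & ~eye, ~same

    sp, sn = sim[pos_mask], sim[neg_mask]
    # Self-paced weights: harder pairs (far from their optimum) weigh more.
    ap = torch.clamp_min(1 + m - sp.detach(), 0.0)
    an = torch.clamp_min(sn.detach() + m, 0.0)
    # Decision margins: push positives above 1 - m, negatives below m.
    logit_p = -gamma * ap * (sp - (1 - m))
    logit_n = gamma * an * (sn - m)
    # softplus(lse(n) + lse(p)) = log(1 + sum_n exp(.) * sum_p exp(.)).
    return F.softplus(torch.logsumexp(logit_p, 0) +
                      torch.logsumexp(logit_n, 0))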


