Article

Dual-branch adaptive attention transformer for occluded person re-identification

Journal

IMAGE AND VISION COMPUTING
Volume 131, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.imavis.2023.104633

Keywords

Person re-identification; Multi-headed self-attention; Transformer; Metric learning

To address the problem of occlusion in person re-identification, an end-to-end dual-branch Transformer network is designed. The network utilizes a global branch and a local branch to extract global and local features. Experimental results demonstrate that the proposed method achieves state-of-the-art performance on four challenging datasets.
Occluded person re-identification remains a common and challenging task because people are often occluded by obstacles (e.g., cars and trees) in the real world. To locate the unoccluded parts and extract local fine-grained features of the occluded human body, state-of-the-art (SOTA) methods usually rely on a pose estimation model, which often introduces additional bias, and the resulting two-stage architecture complicates the model. To solve this problem, an end-to-end dual-branch Transformer network for occluded person re-identification is designed. Specifically, one branch is the Transformer-based global branch, which is responsible for extracting global features, while in the other, local branch, we design the Selective Token Attention (STA) module. STA utilizes the multi-headed self-attention mechanism to select discriminative tokens for effectively extracting local features. Further, to alleviate the inconsistency between the convergence goals of Softmax Loss and Triplet Loss, Circle Loss is introduced to design the Goal Consistency Loss (GC Loss) to supervise the network. Experiments on four challenging datasets for Re-ID tasks (including occluded person Re-ID and holistic person Re-ID) show that our method achieves SOTA performance. (c) 2023 Elsevier B.V. All rights reserved.
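The abstract's token-selection idea can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch rendering of attention-guided token selection in the spirit of the Selective Token Attention (STA) module; the class name `SelectiveTokenAttention`, the top-k selection rule, and all hyperparameters are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SelectiveTokenAttention(nn.Module):
    """Sketch: keep the k patch tokens the class token attends to most,
    then refine them with multi-head self-attention (hypothetical design)."""

    def __init__(self, dim: int, num_heads: int = 8, top_k: int = 32):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.top_k = top_k

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, 1 + N, C); index 0 is the class token.
        _, weights = self.attn(tokens, tokens, tokens,
                               need_weights=True, average_attn_weights=True)
        # Attention the class token pays to each patch token: (B, N).
        cls_to_patch = weights[:, 0, 1:]
        idx = cls_to_patch.topk(self.top_k, dim=-1).indices        # (B, k)
        patches = tokens[:, 1:, :]
        gather_idx = idx.unsqueeze(-1).expand(-1, -1, patches.size(-1))
        selected = patches.gather(1, gather_idx)                   # (B, k, C)
        # Refine the selected (presumably unoccluded) tokens into local features.
        refined, _ = self.attn(selected, selected, selected)
        return refined
```

Similarly, the Circle Loss that the abstract introduces for the Goal Consistency Loss has a standard published form (Sun et al., CVPR 2020); the sketch below implements that formulation. How the paper weights it against the identity classification loss is not stated in the abstract, so no combination is shown.

```python
import torch
import torch.nn.functional as F


def circle_loss(sp: torch.Tensor, sn: torch.Tensor,
                margin: float = 0.25, gamma: float = 64.0) -> torch.Tensor:
    """sp: within-class similarities, sn: between-class similarities (1-D)."""
    ap = torch.clamp_min(1 + margin - sp.detach(), 0.0)  # adaptive positive weight
    an = torch.clamp_min(sn.detach() + margin, 0.0)      # adaptive negative weight
    delta_p, delta_n = 1 - margin, margin                # class-specific targets
    logit_p = -gamma * ap * (sp - delta_p)
    logit_n = gamma * an * (sn - delta_n)
    # log(1 + sum_i exp(logit_p_i) * sum_j exp(logit_n_j))
    return F.softplus(torch.logsumexp(logit_p, dim=0) +
                      torch.logsumexp(logit_n, dim=0))
```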
