Article

ResT-ReID: Transformer block-based residual learning for person re-identification

Journal

PATTERN RECOGNITION LETTERS
Volume 157, Pages 90-96

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2022.03.020

Keywords

Person re-identification; Vision transformer; Graph convolution networks; Self-attention strategy

Funding

  1. National Natural Science Foundation of China [61806206, 62172417, 61973305, U1610124]
  2. Jiangsu Province Natural Science Foundation [BK20201346, BK20180639]
  3. Six Talent Climax Foundation of Jiangsu [2015-DZXX-010, 2018-XYDXX044]


This paper proposes ResT-ReID, a hybrid backbone network based on ResNet-50 and Transformer blocks for person re-identification. It replaces depth-wise convolution in ResNet-50 with global self-attention and uses attention-guided Graph Convolution Networks to fully exploit knowledge of individuals. Experimental results demonstrate that ResT-ReID achieves competitive results on the person re-identification task.
The Transformer has been applied to computer vision to explore long-range dependencies with a multi-head self-attention strategy; consequently, numerous Transformer-based methods for person re-identification (ReID) have been designed to extract effective and robust representations. However, the memory and computational complexity of scaled dot-product attention in the Transformer incur substantial overhead. To overcome these limitations, this paper presents the ResT-ReID method, which designs a hybrid backbone, Res-Transformer, based on ResNet-50 and Transformer blocks for effective identity information. Specifically, we use global self-attention in place of depth-wise convolution in the fourth layer's residual bottleneck of ResNet-50. To fully exploit the entire knowledge of the person, we devise attention-guided Graph Convolution Networks (GCNs) with side information embedding (SIE-AGCN), which has an attention layer located between two GCN layers. Quantitative experiments on two large-scale ReID benchmarks demonstrate that the proposed ResT-ReID achieves competitive results compared with several state-of-the-art approaches. (c) 2022 Published by Elsevier B.V.
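The paper itself provides no code here; as a point of reference, the scaled dot-product attention whose quadratic cost the abstract refers to can be sketched in NumPy as below. The function name, the softmax formulation, and the toy token/channel shapes (49 spatial tokens from a 7x7 stage-4 feature map, 64 channels) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.

    The intermediate (n, n) score matrix is what makes memory and
    compute grow quadratically with the number of tokens n -- the
    overhead the abstract says Transformer-based ReID methods incur.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                # (n, n) score matrix
    # Numerically stable softmax over each row of the scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                           # (n, d) attended output

# Toy example: 49 tokens (a flattened 7x7 feature map, as a ResNet
# stage-4 output might yield) with 64 channels; self-attention uses
# the same tensor as queries, keys, and values.
rng = np.random.default_rng(0)
x = rng.standard_normal((49, 64))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (49, 64)
```

Replacing a 3x3 depth-wise convolution with this global operation, as the abstract describes for the fourth residual stage, trades the convolution's fixed local receptive field for token-to-token interactions across the whole feature map.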

