Article

RGB-IR cross-modality person ReID based on teacher-student GAN model

Journal

PATTERN RECOGNITION LETTERS
Volume 150, Pages 155-161

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2021.07.006

Keywords

Person ReID; Cross-modality; Teacher-student model

Abstract

This paper addresses the cross-modality feature gap in RGB-Infrared (RGB-IR) person re-identification (ReID) with a Teacher-Student GAN model (TS-GAN) that guides the ReID backbone network. The proposed model requires only the backbone module at the test stage, making it more efficient and resource-saving.
RGB-Infrared (RGB-IR) person re-identification (ReID) is a technology that automatically identifies the same person appearing in different parts of a video when visible light is unavailable. The critical challenge of this task is the cross-modality gap between features extracted under different modalities. To address this challenge, we propose a Teacher-Student GAN model (TS-GAN) that adapts to different domains and guides the ReID backbone. (1) To obtain corresponding RGB-IR image pairs, an RGB-IR Generative Adversarial Network (GAN) is used to generate IR images. (2) To kick-start identity training, a ReID Teacher module is trained on IR-modality person images and then used to guide its Student counterpart during training. (3) To better align features from different domains and enhance ReID performance, three Teacher-Student loss functions are used. Unlike other GAN-based models, the proposed model needs only the backbone module at the test stage, making it more efficient and resource-saving. To demonstrate the model's capability, we conducted extensive experiments on the newly released SYSU-MM01 and RegDB RGB-IR ReID benchmarks and achieved performance superior to the state of the art, with 47.4% mAP and 69.4% mAP respectively. (C) 2021 Elsevier B.V. All rights reserved.
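The abstract does not spell out the three Teacher-Student losses or the backbone architecture, so the sketch below is only an illustration of the general guidance scheme it describes: a frozen Teacher trained on IR images supervises a Student backbone through an identity loss plus assumed feature-alignment and soft-label distillation terms. Names such as ReIDBackbone and teacher_student_losses are hypothetical, as are the specific loss forms.

```python
# Minimal sketch of teacher-student guidance for cross-modality ReID.
# Assumptions (not taken from the paper): the teacher-student losses are a
# feature-level MSE and a KL divergence on identity logits; the abstract only
# states that three teacher-student losses are used, without their exact form.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReIDBackbone(nn.Module):
    """Placeholder ReID backbone producing an embedding and identity logits."""

    def __init__(self, feat_dim=2048, num_ids=395):
        super().__init__()
        self.encoder = nn.Sequential(            # stands in for a CNN encoder
            nn.Conv2d(3, 64, 7, stride=2, padding=3),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        self.classifier = nn.Linear(feat_dim, num_ids)

    def forward(self, x):
        feat = self.encoder(x)
        return feat, self.classifier(feat)


def teacher_student_losses(student, teacher, rgb_batch, ir_batch, labels, T=4.0):
    """Guide the student (RGB input) with a teacher pre-trained on IR images."""
    with torch.no_grad():                        # teacher stays frozen
        t_feat, t_logits = teacher(ir_batch)
    s_feat, s_logits = student(rgb_batch)

    id_loss = F.cross_entropy(s_logits, labels)  # identity supervision
    feat_loss = F.mse_loss(s_feat, t_feat)       # feature alignment (assumed form)
    kd_loss = F.kl_div(                          # soft-label distillation (assumed form)
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * T * T
    return id_loss + feat_loss + kd_loss
```

In this reading, the ir_batch would be the IR images paired with the RGB inputs (for example, produced by the RGB-IR GAN), and only the student backbone is kept at test time, which matches the efficiency claim in the abstract.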
