Article

JAC-Net: Joint learning with adaptive exploration and concise attention for unsupervised domain adaptive person re-identification

Journal

NEUROCOMPUTING
Volume 483, Pages 262-274

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2022.02.010

Keywords

Person re-identification; Domain adaptation; Joint learning; Adaptive exploration; Concise attention

Funding

  1. National Natural Science Foundation of China [61806071]
  2. Natural Science Foundation of Hebei Province [F2019202381, F2019202464]


This paper proposes JAC-Net, a method for person re-identification in an unlabeled target domain. It uses clustering to generate pseudo-labels and optimizes the training network with a joint learning network and a concise attention module. Experiments show that JAC-Net performs well on multiple datasets, approaching the accuracy of supervised methods.

Existing unsupervised domain adaptive (UDA) methods for person re-identification (re-ID) often use clustering to generate and refine pseudo-labels. However, pseudo-labels generated in this way contain noise, which is gradually amplified during iterative training, leading to lower recognition accuracy than supervised methods achieve. This paper proposes the Joint Learning with Adaptive Exploration and Concise Attention Network (JAC-Net), which uses two identical networks to optimize person re-ID in an unlabeled target domain. Based on the pseudo-labels generated by clustering, JAC-Net optimizes the training network by combining a joint learning network (JLN) with a concise attention module (CAM). Inspired by teacher-student networks, the JLN uses two identical networks that share knowledge during learning, and applies an adaptive exploration strategy to automatically assign weights to the two networks and balance the influence of the knowledge each contributes. As a parameter-free attention module, the CAM is applied to feature maps extracted at specific layers of ResNet50 without altering the high-order semantic features. Extensive experiments on the Market-1501, DukeMTMC-reID and MSMT17 datasets show that JAC-Net achieves strong performance, reaching a level comparable to that of supervised learning methods. (C) 2022 Elsevier B.V. All rights reserved.
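The abstract does not give the exact formula behind the adaptive exploration strategy, but the idea of automatically weighting two peer networks can be illustrated with a minimal sketch. The softmax-over-negative-losses rule below is a hypothetical weighting scheme (the function names `adaptive_weights` and `joint_loss` and the `temperature` parameter are illustrative, not from the paper): the network currently incurring the lower loss receives the larger weight, so the combined objective leans on the more reliable peer.

```python
import math

def adaptive_weights(loss_a, loss_b, temperature=1.0):
    """Hypothetical adaptive weighting: softmax over negative losses.

    The peer network with the lower loss receives the larger weight;
    `temperature` controls how sharply the weights favor it.
    """
    za = math.exp(-loss_a / temperature)
    zb = math.exp(-loss_b / temperature)
    total = za + zb
    return za / total, zb / total

def joint_loss(loss_a, loss_b, temperature=1.0):
    """Combine the two peer losses using the adaptive weights."""
    wa, wb = adaptive_weights(loss_a, loss_b, temperature)
    return wa * loss_a + wb * loss_b

# Example: peer A has the lower loss, so it dominates the joint objective.
wa, wb = adaptive_weights(0.5, 1.5)
combined = joint_loss(0.5, 1.5)
```

With `temperature=1.0` and losses `0.5` and `1.5`, peer A gets roughly 73% of the weight; raising the temperature pushes the weights back toward an equal 50/50 split.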
