Journal: KNOWLEDGE-BASED SYSTEMS
Volume 252
Publisher: ELSEVIER
DOI: 10.1016/j.knosys.2022.109354
Keywords: Person re-identification; Domain adaptation; Semantic-driven attention; Attribute learning
Funding
- National Natural Science Foundation of China [61773262, 62006152]
- Aeronautical Science Foundation of China [20142057006]
Unsupervised domain adaptation for person re-identification is hampered by cross-domain background variation and insufficient foreground identification cues. This paper presents a body structure estimation mechanism that enforces a semantic-driven attention network, improving feature representations through dynamically optimized attribute learning.
Unsupervised domain adaptation (UDA) person re-identification (re-ID) aims to transfer knowledge from a labeled source domain to an unlabeled target domain, where the two domains contain disjoint identities captured across multiple camera views. Traditional UDA re-ID techniques therefore suffer from negative transfer caused by the inevitable noise of varying backgrounds, while the foregrounds lack sufficiently reliable identification knowledge to guarantee qualified cross-domain re-ID. To remedy the negative transfer caused by varying backgrounds, we propose a novel body structure estimation (BSE) mechanism that enforces a semantic-driven attention network (SDA), endowing the model with the semantic capability to distinguish foreground from background. To obtain reliable feature representations within the foreground areas, we further propose a novel label refinery mechanism that dynamically optimizes traditional attribute learning, strengthening personal attribute features and thus yielding qualified UDA re-ID. Extensive experiments on three large-scale datasets, Market-1501, DukeMTMC-reID and MSMT17, demonstrate the effectiveness of our method on the unsupervised domain adaptation person re-ID task. (C) 2022 Elsevier B.V. All rights reserved.
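The two ideas in the abstract, suppressing background responses with a semantic attention mask and refining noisy attribute pseudo-labels, can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the paper's actual architecture: `foreground_attention` assumes a body-structure confidence map is already available, and `refine_attribute_labels` approximates a label refinery with a simple momentum blend of pseudo-labels and current predictions.

```python
import numpy as np

def foreground_attention(features, body_conf):
    """Weight a feature map by a soft foreground mask (illustrative sketch).

    features:  (C, H, W) convolutional feature map
    body_conf: (H, W) body-structure confidence logits (assumed given here)
    """
    mask = 1.0 / (1.0 + np.exp(-body_conf))   # sigmoid -> soft mask in (0, 1)
    return features * mask[None, :, :]        # suppress background responses

def refine_attribute_labels(pseudo, predicted, momentum=0.8):
    """Blend current attribute predictions into pseudo-labels.

    A simple momentum update standing in for the paper's label refinery:
    high momentum keeps pseudo-labels stable while slowly absorbing
    the model's own attribute predictions on the target domain.
    """
    return momentum * pseudo + (1.0 - momentum) * predicted

# Toy usage: strong foreground confidence on the left column only.
feats = np.ones((2, 2, 2))
conf = np.array([[10.0, -10.0],
                 [10.0, -10.0]])
attended = foreground_attention(feats, conf)   # left ~1.0, right ~0.0

labels = refine_attribute_labels(np.array([1.0, 0.0]),
                                 np.array([0.0, 1.0]))  # -> [0.8, 0.2]
```

The momentum parameter trades stability against adaptation speed; a real pipeline would also re-cluster target features between refinery steps.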