Article

Fusion of Multiple Person Re-id Methods With Model and Data-Aware Abilities

Journal

IEEE TRANSACTIONS ON CYBERNETICS
Volume 50, Issue 2, Pages 561-571

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCYB.2018.2869739

Keywords

Measurement; Task analysis; Robustness; Learning systems; Visualization; Benchmark testing; Computational modeling; Fusion; generative model of labels; abilities; difficulties (GLAD); person reidentification (person re-id)

Abstract

Person re-identification (person re-id) has attracted rapidly increasing attention in the computer vision and pattern recognition research community in recent years. With the goal of producing a match ranking between each query person image and the gallery images, the person re-id problem has been widely explored and a large number of person re-id methods have been developed. Because these algorithms rely on different prior assumptions, image features, distance matching functions, etc., each has its own strengths and weaknesses. Motivated by this, this paper proposes a novel person re-id method based on the idea of inferring superior fused results from a variety of existing base person re-id algorithms that use different methodologies or features. To this end, we propose a framework consisting of two main steps: 1) a number of existing person re-id methods are implemented and their ranking results are obtained on the test datasets; and 2) a robust fusion strategy is applied to obtain better re-ranked matching results by simultaneously considering the recognition abilities of the various base re-id methods and the difficulties of correctly recognizing different gallery person images, under the generative model of labels, abilities, and difficulties (GLAD) framework. Comprehensive experiments demonstrate the effectiveness of the proposed method, which achieves state-of-the-art results on recent popular person re-id datasets.
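
The abstract only outlines the GLAD-based fusion step. As a rough illustration (not the authors' implementation), the Python sketch below fuses binary top-k "votes" from several base re-id rankers with a simplified Whitehill-style GLAD EM: it estimates a per-method ability and a per-gallery-image difficulty, then re-ranks the gallery by the posterior probability of being a true match. The vote construction, the gradient-ascent M-step, and all function and variable names are illustrative assumptions only.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glad_fuse(votes, n_iter=50, lr=0.1):
    """Simplified GLAD-style fusion (illustrative, not the paper's exact method).

    votes: (n_methods, n_gallery) binary matrix; votes[m, g] = 1 if base method m
    ranks gallery image g within its top-k for the current query, else 0.
    Returns the posterior probability that each gallery image is a true match.
    """
    n_methods, n_gallery = votes.shape
    alpha = np.ones(n_methods)        # per-method recognition ability
    beta = np.ones(n_gallery)         # per-image inverse difficulty (kept > 0)
    prior = np.full(n_gallery, 0.5)   # prior probability that an image matches

    for _ in range(n_iter):
        # E-step: posterior that each gallery image truly matches the query,
        # given current abilities/difficulties and the observed votes.
        p_correct = sigmoid(np.outer(alpha, beta))   # P(vote agrees with truth)
        like_match = np.prod(np.where(votes == 1, p_correct, 1 - p_correct), axis=0)
        like_nomatch = np.prod(np.where(votes == 0, p_correct, 1 - p_correct), axis=0)
        post = prior * like_match / (prior * like_match
                                     + (1 - prior) * like_nomatch + 1e-12)

        # M-step: gradient ascent on the expected complete-data log-likelihood
        # (a simplification of the exact GLAD update).
        agree = post[None, :] * votes + (1 - post)[None, :] * (1 - votes)
        grad = agree - p_correct                       # d logL / d(alpha * beta)
        alpha += lr * (grad * beta[None, :]).sum(axis=1)
        beta += lr * (grad * alpha[:, None]).sum(axis=0)
        beta = np.clip(beta, 1e-3, None)               # keep difficulties valid

    return post

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy example: 3 base re-id methods voting over 10 gallery images for one query.
    votes = rng.integers(0, 2, size=(3, 10))
    scores = glad_fuse(votes)
    reranked = np.argsort(-scores)    # fused re-ranking of the gallery
    print("fused ranking:", reranked)

Sorting the gallery by this posterior mirrors step 2 of the framework described above; the paper's actual method works on the full ranking results of the base methods rather than the binary top-k votes assumed here.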

