Venue
2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022
Volume: -, Issue: -, Pages: 2806-2815
Publisher: IEEE
DOI: 10.1109/CVPRW56347.2022.00318
Keywords: -
Most popular metric learning losses are not directly related to the evaluation metrics used to assess their performance. However, training a metric learning model by maximizing the area under the ROC curve can induce a suitable implicit ranking for retrieval problems. By proposing an approximated and derivable AUC loss, state-of-the-art performance is achieved on large-scale retrieval benchmark datasets.
Most popular metric learning losses have no direct relation with the evaluation metrics that are subsequently applied to evaluate their performance. We hypothesize that training a metric learning model by maximizing the area under the ROC curve (which is a typical performance measure of recognition systems) can induce an implicit ranking suitable for retrieval problems. This hypothesis is supported by previous work that proved that a curve dominates in ROC space if and only if it dominates in Precision-Recall space. To test this hypothesis, we design and maximize an approximated, derivable relaxation of the area under the ROC curve. The proposed AUC loss achieves state-of-the-art results on two large-scale retrieval benchmark datasets (Stanford Online Products and DeepFashion In-Shop). Moreover, the AUC loss achieves comparable performance to more complex, domain-specific, state-of-the-art methods for vehicle re-identification.
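The core idea, a differentiable relaxation of AUC, can be illustrated with a common sigmoid-based surrogate. This is only a sketch of the general technique, not the paper's exact formulation: exact AUC counts the fraction of (positive, negative) score pairs ranked correctly, and the non-differentiable step function over pairwise score differences is replaced by a temperature-scaled sigmoid. The function names and the temperature parameter `tau` here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def exact_auc(pos_scores, neg_scores):
    """Exact AUC: fraction of (positive, negative) pairs where the
    positive outscores the negative (Wilcoxon-Mann-Whitney statistic)."""
    diffs = pos_scores[:, None] - neg_scores[None, :]  # all pairwise differences
    return np.mean(diffs > 0)

def soft_auc_loss(pos_scores, neg_scores, tau=0.1):
    """Differentiable surrogate: replace the 0/1 step over pairwise
    differences with a sigmoid scaled by temperature tau, and minimize
    one minus the resulting soft AUC."""
    diffs = pos_scores[:, None] - neg_scores[None, :]
    return 1.0 - np.mean(sigmoid(diffs / tau))

# Toy similarity scores for positive (matching) and negative pairs.
pos = np.array([0.9, 0.8, 0.4])
neg = np.array([0.3, 0.2, 0.7])
```

As `tau` shrinks, the sigmoid approaches the step function and the loss approaches `1 - AUC`; larger `tau` gives smoother gradients at the cost of a looser approximation.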