Proceedings Paper

Area Under the ROC Curve Maximization for Metric Learning

Publisher

IEEE
DOI: 10.1109/CVPRW56347.2022.00318


Abstract

Most popular metric learning losses are not directly related to the evaluation metrics used to assess their performance. However, training a metric learning model by maximizing the area under the ROC curve can induce an implicit ranking suitable for retrieval problems. The proposed approximated, derivable AUC loss achieves state-of-the-art performance on large-scale retrieval benchmark datasets.
Most popular metric learning losses have no direct relation with the evaluation metrics that are subsequently applied to evaluate their performance. We hypothesize that training a metric learning model by maximizing the area under the ROC curve (which is a typical performance measure of recognition systems) can induce an implicit ranking suitable for retrieval problems. This hypothesis is supported by previous work that proved that a curve dominates in ROC space if and only if it dominates in Precision-Recall space. To test this hypothesis, we design and maximize an approximated, derivable relaxation of the area under the ROC curve. The proposed AUC loss achieves state-of-the-art results on two large-scale retrieval benchmark datasets (Stanford Online Products and DeepFashion In-Shop). Moreover, the AUC loss achieves comparable performance to more complex, domain-specific, state-of-the-art methods for vehicle re-identification.
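The core idea, maximizing a smooth surrogate of AUC over positive/negative score pairs, can be sketched as follows. This is an illustrative pairwise sigmoid relaxation (the 0/1 step in the exact AUC is replaced by a temperature-scaled sigmoid), not the authors' exact formulation; the function names and the temperature parameter `tau` are assumptions for illustration only:

```python
import numpy as np

def auc(pos_scores, neg_scores):
    """Exact AUC: fraction of (positive, negative) score pairs
    where the positive pair is ranked above the negative pair."""
    diffs = pos_scores[:, None] - neg_scores[None, :]  # all pairwise differences
    return (diffs > 0).mean()

def soft_auc_loss(pos_scores, neg_scores, tau=1.0):
    """Differentiable surrogate of 1 - AUC: the step function is
    relaxed to a sigmoid with temperature tau, so the loss can be
    minimized by gradient descent on the embedding model."""
    diffs = pos_scores[:, None] - neg_scores[None, :]
    soft_auc = 1.0 / (1.0 + np.exp(-diffs / tau))      # sigmoid relaxation of (diffs > 0)
    return 1.0 - soft_auc.mean()
```

In a metric learning setting, `pos_scores` would hold similarities of same-class embedding pairs and `neg_scores` similarities of different-class pairs within a batch; as `tau` approaches zero the surrogate approaches the exact (non-differentiable) AUC.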

