Article

Characterising the area under the curve loss function landscape

Journal

Publisher

IOP Publishing Ltd
DOI: 10.1088/2632-2153/ac49a9

Keywords

area under the curve; loss function landscape; basin hopping; alternative loss function; loss function

Funding

  1. Downing College, Cambridge
  2. Voellm-Hruska PhD studentship
  3. EPSRC
  4. International Chair at the Interdisciplinary Institute for Artificial Intelligence at 3iA Cote d'Azur - French government [ANR-19-P3IA-0002]


Summary

This paper compares minimising the cross-entropy loss with direct optimisation of the AUC for neural network classifiers. It analyses the characteristics of approximate AUC loss functions and provides a theoretical explanation for them. The findings show that an approximate AUC loss function can improve test AUC, but its minima are less robust.
One of the most common metrics to evaluate neural network classifiers is the area under the receiver operating characteristic curve (AUC). However, optimisation of the AUC as the loss function during network training is not a standard procedure. Here we compare minimising the cross-entropy (CE) loss and optimising the AUC directly. In particular, we analyse the loss function landscape (LFL) of approximate AUC (appAUC) loss functions to discover the organisation of this solution space. We discuss various surrogates for AUC approximation and show their differences. We find that the characteristics of the appAUC landscape are significantly different from the CE landscape. The approximate AUC loss function improves testing AUC, and the appAUC landscape has substantially more minima, but these minima are less robust, with larger average Hessian eigenvalues. We provide a theoretical foundation to explain these results. To generalise our results, we lastly provide an overview of how the LFL can help to guide loss function analysis and selection.
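The core idea behind the AUC surrogates discussed in the abstract can be sketched as follows: the exact AUC is the Wilcoxon-Mann-Whitney statistic (the fraction of positive/negative score pairs ranked correctly), which is piecewise constant and therefore not directly optimisable by gradient descent, so a smooth surrogate replaces the pairwise step function with a differentiable approximation. The sketch below is illustrative only; the function names and the sigmoid steepness parameter `beta` are assumptions, not the paper's exact appAUC formulation.

```python
import numpy as np

def exact_auc(scores, labels):
    """Exact AUC via the Wilcoxon-Mann-Whitney statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    counting ties as half-correct."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    diffs = pos[:, None] - neg[None, :]       # all pairwise score differences
    return (np.sum(diffs > 0) + 0.5 * np.sum(diffs == 0)) / diffs.size

def app_auc_loss(scores, labels, beta=10.0):
    """Smooth surrogate loss: replace the step function on pairwise
    score differences with a sigmoid of steepness beta, and return
    1 - (approximate AUC) so that minimising the loss raises the AUC."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    diffs = pos[:, None] - neg[None, :]
    return 1.0 - np.mean(1.0 / (1.0 + np.exp(-beta * diffs)))
```

As `beta` grows, the sigmoid approaches the step function and the surrogate approaches `1 - AUC`; smaller `beta` gives a smoother landscape at the cost of a looser approximation, which is the kind of trade-off the landscape analysis in the paper probes.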

