Article

Learning from Imbalanced Data Sets with Weighted Cross-Entropy Function

Journal

NEURAL PROCESSING LETTERS
Volume 50, Issue 2, Pages 1937-1949

Publisher

SPRINGER
DOI: 10.1007/s11063-018-09977-1

Keywords

Multilayer perceptron; Imbalanced data; Classification problem; Back-propagation; Cost-sensitive function

Funding

  1. CNPq
  2. FAPEMIG
  3. CAPES

Abstract

This paper presents a novel approach to the imbalanced data set problem in neural networks, incorporating prior probabilities into a cost-sensitive cross-entropy error function. Several classical benchmarks were used for performance evaluation under different metrics, namely G-Mean, area under the ROC curve (AUC), adjusted G-Mean, accuracy, true positive rate, true negative rate, and F1-score. The results were compared with well-known algorithms and demonstrate the effectiveness and robustness of the proposed approach, which yields well-balanced classifiers across different imbalance scenarios.
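The sketch below illustrates the general idea of weighting a cross-entropy loss by class priors so that minority-class errors contribute more to the total cost. The function name, the inverse-prior weighting scheme, and the toy data are illustrative assumptions; the paper's exact formulation may differ.

```python
import numpy as np

def weighted_cross_entropy(y_true, y_pred, priors, eps=1e-12):
    """Binary cross-entropy with class weights derived from prior probabilities.

    y_true : array of 0/1 labels
    y_pred : array of predicted probabilities for the positive class
    priors : (p_neg, p_pos) empirical class priors, e.g. (0.9, 0.1)

    Each class is weighted by the inverse of its prior, so mistakes on the
    rare (positive) class are penalized more heavily -- an illustrative
    choice, not necessarily the weighting used in the paper.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    w_neg = 1.0 / priors[0]
    w_pos = 1.0 / priors[1]
    loss = -(w_pos * y_true * np.log(y_pred)
             + w_neg * (1.0 - y_true) * np.log(1.0 - y_pred))
    return loss.mean()

# Toy example: 10% positives, so positive-class errors get a 10x weight.
y_true = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 1])
y_pred = np.array([0.10, 0.20, 0.10, 0.05, 0.10, 0.20, 0.10, 0.10, 0.15, 0.40])
print(weighted_cross_entropy(y_true, y_pred, priors=(0.9, 0.1)))
```

In a multilayer perceptron trained by back-propagation, such a weighted loss changes only the error term propagated from the output layer; the rest of the training procedure stays unchanged.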
