Article

Quantifying Prediction Uncertainty in Regression Using Random Fuzzy Sets: The ENNreg Model

Journal

IEEE TRANSACTIONS ON FUZZY SYSTEMS
Volume 31, Issue 10, Pages 3690-3699

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TFUZZ.2023.3268200

Keywords

Belief functions; Dempster-Shafer theory; evidence theory; machine learning; neural networks


This article presents a neural network model for regression that quantifies prediction uncertainty using Gaussian random fuzzy numbers (GRFNs). The model is competitive with other state-of-the-art techniques in terms of both prediction accuracy and calibration error.
In this article, we introduce a neural network model for regression in which prediction uncertainty is quantified by Gaussian random fuzzy numbers (GRFNs), a newly introduced family of random fuzzy subsets of the real line that generalizes both Gaussian random variables and Gaussian possibility distributions. The output GRFN is constructed by combining GRFNs induced by prototypes using a combination operator that generalizes Dempster's rule of evidence theory. The three output units indicate the most plausible value of the response variable, variability around this value, and epistemic uncertainty. The network is trained by minimizing a loss function that generalizes the negative log-likelihood. Comparative experiments show that this method is competitive, both in terms of prediction accuracy and calibration error, with state-of-the-art techniques such as random forests or deep learning with Monte Carlo dropout. In addition, the model outputs a predictive belief function that can be shown to be calibrated, in the sense that it allows us to compute conservative prediction intervals with specified belief degree.
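The abstract describes the architecture only at a high level. As a rough illustration, the following NumPy sketch shows how a prototype-based forward pass of this kind could produce an output triple (most plausible value, variability, evidence weight). The function grfn_forward, the distance-based activations, and the pooling formulas below are illustrative assumptions, not the paper's exact definitions of GRFN combination.

```python
import numpy as np

# Hypothetical sketch of an ENNreg-style forward pass (names and formulas are
# illustrative assumptions, not the paper's exact definitions).
# Each prototype carries a local mean mu_k and variance sigma2_k, and produces
# evidence whose weight h_k decays with distance to the input.

def grfn_forward(x, prototypes, mu_k, sigma2_k, gamma, beta):
    """Combine prototype-induced GRFNs into one output GRFN (mu, sigma2, h).

    x          : (d,) input vector
    prototypes : (K, d) prototype locations
    mu_k       : (K,) per-prototype predicted means
    sigma2_k   : (K,) per-prototype variances
    gamma      : (K,) distance scales controlling evidence decay
    beta       : (K,) base evidence weights (>= 0)
    """
    # Distance-based activation: closer prototypes contribute more evidence.
    d2 = np.sum((prototypes - x) ** 2, axis=1)
    h_k = beta * np.exp(-gamma * d2)

    # Weighted pooling of the prototype GRFNs: evidence weights add up and the
    # mean is a weight-averaged value. This only mimics the behaviour of a
    # product-intersection-style combination; the paper's exact rule may differ.
    h = np.sum(h_k)
    mu = np.sum(h_k * mu_k) / max(h, 1e-12)
    sigma2 = np.sum(h_k ** 2 * sigma2_k) / max(h, 1e-12) ** 2

    # (mu, sigma2, h): most plausible value, variability around it, and total
    # evidence weight (a small h means little nearby evidence).
    return mu, sigma2, h


# Toy usage: three prototypes in one dimension.
rng = np.random.default_rng(0)
P = rng.normal(size=(3, 1))
mu, s2, h = grfn_forward(np.array([0.2]), P,
                         mu_k=np.array([1.0, 0.5, -0.3]),
                         sigma2_k=np.array([0.1, 0.2, 0.1]),
                         gamma=np.array([1.0, 1.0, 1.0]),
                         beta=np.array([1.0, 1.0, 1.0]))
print(mu, s2, h)
```

In this sketch, a small combined weight h corresponds to little nearby evidence, which plays the role the abstract assigns to the third output unit: high epistemic uncertainty far from the training prototypes.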
