Article

An All-Batch Loss for Constructing Prediction Intervals

Journal

APPLIED SCIENCES-BASEL
Volume 11, Issue 4, Pages: -

Publisher

MDPI
DOI: 10.3390/app11041728

Keywords

feedforward neural network; prediction interval; uncertainty quantification; loss function

Funding

  1. National Key Research and Development Program of China [2017YFC0403701]

Abstract

The prediction interval (PI) is an important research topic in reliability analysis and decision support systems. Data size and computational cost are two issues that can hamper the construction of PIs. This paper proposes an all-batch (AB) loss function for constructing high-quality PIs. By taking full advantage of the likelihood principle, the proposed loss makes it possible to train PI generation models with the gradient descent (GD) method on both small and large batches of samples. Built on a dual feedforward neural network (FNN) structure, a high-quality PI generation framework is introduced that can be adapted to a variety of problems, including regression analysis. Numerical experiments on benchmark datasets show that the proposed scheme yields higher-quality PIs; its reliability and stability were also verified in comparison with various state-of-the-art PI construction methods.
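This record does not reproduce the paper's AB loss itself, so the Python sketch below is only a hypothetical illustration of the setup the abstract describes: a feedforward network that emits lower and upper PI bounds, trained by gradient descent with a loss that trades interval width against coverage. The names DualBoundFNN and pi_loss and the hyperparameters alpha, soften, and lam are illustrative assumptions, not the authors' method; where the abstract mentions dual FNNs, the sketch simplifies to a single shared body with a two-unit head. The softened (sigmoid) coverage indicator shown here is one common way to keep such a loss differentiable for any batch size.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DualBoundFNN(nn.Module):
    # Feedforward network emitting a lower and an upper PI bound per input.
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 2)  # raw [lower, width] outputs

    def forward(self, x):
        lo, width = self.head(self.body(x)).chunk(2, dim=-1)
        hi = lo + F.softplus(width)  # softplus keeps upper >= lower
        return lo, hi

def pi_loss(y, lo, hi, alpha=0.05, soften=50.0, lam=10.0):
    # Soft indicator that y falls inside [lo, hi]; the sigmoid relaxation
    # keeps gradients flowing regardless of batch size.
    captured = torch.sigmoid(soften * (y - lo)) * torch.sigmoid(soften * (hi - y))
    coverage = captured.mean()   # soft PI coverage probability
    width = (hi - lo).mean()     # mean PI width, to be minimized
    # Penalize coverage falling short of the nominal (1 - alpha) level.
    return width + lam * torch.relu((1 - alpha) - coverage) ** 2

A typical training step would compute loss = pi_loss(y, *model(x)) and apply a standard optimizer via loss.backward() and opt.step(). Because every term is a smooth batch average, the same objective applies unchanged to small and large batches, which is the property the abstract attributes to the AB loss.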
