Article

Machine-Learning Methods on Noisy and Sparse Data

Journal

MATHEMATICS
Volume 11, Issue 1, Pages -

Publisher

MDPI
DOI: 10.3390/math11010236

Keywords

machine learning; deep neural networks; MARS; splines; interpolation; feedforward neural networks; noisy data; sparse data


Abstract

Experimental and computational data, as well as field data obtained from measurements, are often sparse and noisy. Consequently, interpolating unknown functions under these restrictions to provide accurate predictions is very challenging. This study compares machine-learning methods and cubic splines on the sparsity of training data they can handle, especially when training samples are noisy. We compare deviation from a true function f using the mean square error, the signal-to-noise ratio, and the Pearson R² coefficient. We show that, given very sparse data, cubic splines constitute a more precise interpolation method than deep neural networks and multivariate adaptive regression splines. In contrast, machine-learning models are robust to noise and can outperform splines after a training data threshold is met. Our study aims to provide a general framework for interpolating one-dimensional signals, often the result of complex scientific simulations or laboratory experiments.
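The abstract's comparison setup can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's actual protocol: the test function `f`, the sample count, and the noise level are all assumptions. It fits a cubic spline to sparse, noisy samples of a known one-dimensional signal and scores the interpolant against the true function with the three metrics the study names (MSE, SNR, and Pearson R²).

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical ground-truth signal; the paper's test functions are not given here.
def f(x):
    return np.sin(2 * np.pi * x)

rng = np.random.default_rng(0)

# Sparse, noisy training samples (8 points and sigma=0.1 are assumptions).
x_train = np.linspace(0.0, 1.0, 8)
y_train = f(x_train) + rng.normal(0.0, 0.1, size=x_train.shape)

# Cubic-spline interpolant through the noisy samples.
spline = CubicSpline(x_train, y_train)

# Dense evaluation grid for comparison against the true function.
x_test = np.linspace(0.0, 1.0, 200)
y_true = f(x_test)
y_pred = spline(x_test)

# Deviation metrics named in the abstract: MSE, SNR (in dB), Pearson R².
mse = np.mean((y_true - y_pred) ** 2)
snr_db = 10.0 * np.log10(np.sum(y_true**2) / np.sum((y_true - y_pred) ** 2))
r2 = np.corrcoef(y_true, y_pred)[0, 1] ** 2

print(f"MSE={mse:.4f}  SNR={snr_db:.1f} dB  R^2={r2:.4f}")
```

Swapping the spline for a feedforward network or a MARS model while keeping the same metrics would reproduce the kind of head-to-head comparison the study describes.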
