Journal
MATHEMATICS
Volume 11, Issue 1
Publisher
MDPI
DOI: 10.3390/math11010236
Keywords
machine learning; deep neural networks; MARS; splines; interpolation; feedforward neural networks; noisy data; sparse data
Categories
Abstract
Experimental and computational data, as well as field data obtained from measurements, are often sparse and noisy. Consequently, interpolating unknown functions under these restrictions to provide accurate predictions is very challenging. This study compares machine-learning methods and cubic splines on the sparsity of training data they can handle, especially when the training samples are noisy. We measure deviation from a true function f using the mean squared error, the signal-to-noise ratio, and the Pearson R^2 coefficient. We show that, given very sparse data, cubic splines constitute a more precise interpolation method than deep neural networks and multivariate adaptive regression splines. In contrast, machine-learning models are robust to noise and can outperform splines once a training-data threshold is met. Our study aims to provide a general framework for interpolating one-dimensional signals, often the result of complex scientific simulations or laboratory experiments.
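The comparison methodology described above can be sketched in a few lines. This is not the authors' code: the test signal f(x) = sin(2πx), the sample count, and the noise level are all hypothetical choices for illustration. It fits SciPy's CubicSpline to sparse, noisy samples and scores the fit against the true function with the mean squared error and the Pearson R^2 coefficient.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)

# Hypothetical 1-D test signal standing in for a simulation output.
def f(x):
    return np.sin(2 * np.pi * x)

# Sparse, noisy training samples (8 points, Gaussian noise).
x_train = np.linspace(0.0, 1.0, 8)
y_train = f(x_train) + rng.normal(scale=0.05, size=x_train.size)

# Fit a cubic spline through the noisy samples
# (default 'not-a-knot' boundary conditions).
spline = CubicSpline(x_train, y_train)

# Evaluate deviation from the true function on a dense grid.
x_test = np.linspace(0.0, 1.0, 200)
y_pred = spline(x_test)
y_true = f(x_test)

mse = np.mean((y_pred - y_true) ** 2)
ss_res = np.sum((y_true - y_pred) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"MSE = {mse:.4f}, R^2 = {r2:.4f}")
```

Repeating this with fewer samples or a larger noise scale, and swapping the spline for a neural-network or MARS regressor, reproduces the kind of sparsity/noise trade-off the study examines.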
Authors