Article

General bound of overfitting for MLP regression models

Journal

NEUROCOMPUTING
Volume 90, Pages 106-110

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2011.11.028

Keywords

Multilayer perceptron; Mean square error; Asymptotic statistic; Model selection

Abstract

Multilayer perceptrons (MLPs) with one hidden layer have long been used for nonlinear regression. In some tasks, however, MLPs are overly powerful models, and a small mean square error (MSE) may be due more to overfitting than to actual modeling. If the noise of the regression model is Gaussian, the overfitting of the model is entirely determined by the behavior of the likelihood ratio test statistic (LRTS); in many cases, however, the assumption of normality of the noise is arbitrary, if not false. In this paper, we present a universal bound for the overfitting of such models under weak assumptions; this bound is valid without Gaussian or identifiability assumptions. The main application of this bound is to help determine the true architecture of the MLP model as the number of observations goes to infinity. As an illustration, we use this theoretical result to propose and compare effective criteria for finding the true architecture of an MLP. (C) 2012 Elsevier B.V. All rights reserved.
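
The Gaussian claim above can be made concrete. For nested Gaussian regression models fitted by maximum likelihood, the LRTS reduces to a log ratio of estimated residual variances; the identity below is the standard form for this setting, sketched for orientation rather than quoted from the paper:

    \mathrm{LRTS} = n \log\!\left( \frac{\widehat{\sigma}^2_p}{\widehat{\sigma}^2_q} \right)

where n is the number of observations and \widehat{\sigma}^2_p, \widehat{\sigma}^2_q are the maximum-likelihood residual-variance estimates under the smaller (p-parameter) and larger (q-parameter) model. Since the larger MLP can only lower the fitted variance, the LRTS directly measures how much apparent MSE improvement the extra parameters buy, i.e., the overfitting.

The abstract's application, comparing criteria to recover the true architecture, can be sketched as penalized-MSE selection over the number of hidden units. The Python snippet below is a minimal illustration assuming a BIC-style log(n) penalty and scikit-learn's MLPRegressor; the paper derives its own criteria from the overfitting bound, so the penalty here is a stand-in, not the authors' method.

    # Minimal sketch: pick the hidden-layer size of a one-hidden-layer MLP
    # by a penalized-MSE criterion. The log(n) penalty is an illustrative
    # assumption, not the criterion proposed in the paper.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n = 500
    X = rng.uniform(-2, 2, size=(n, 1))
    y = np.tanh(2 * X[:, 0]) + 0.1 * rng.standard_normal(n)

    def penalized_criterion(hidden_units):
        mlp = MLPRegressor(hidden_layer_sizes=(hidden_units,),
                           max_iter=5000, random_state=0)
        mlp.fit(X, y)
        mse = np.mean((mlp.predict(X) - y) ** 2)
        # Total number of weights and biases in the fitted network
        n_params = (sum(w.size for w in mlp.coefs_)
                    + sum(b.size for b in mlp.intercepts_))
        # BIC-style score: n*log(MSE) plus a log(n) penalty per parameter
        return n * np.log(mse) + np.log(n) * n_params

    scores = {h: penalized_criterion(h) for h in (1, 2, 4, 8, 16)}
    best = min(scores, key=scores.get)
    print(scores, "-> selected hidden units:", best)

A log(n)-scaled penalty is the usual route to consistent selection as the sample size grows, which matches the abstract's asymptotic framing; lighter penalties tend to retain superfluous hidden units, i.e., to overfit.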
