Article | Proceedings Paper

Iterative design of neural network classifiers through regression

Publisher

World Scientific Publishing Co Pte Ltd
DOI: 10.1142/S0218213005002107

Keywords

neural networks; classification; relaxation approach; MLP training

A method that modifies the objective function used for designing neural network classifiers is presented. The classical mean-square error criterion is relaxed by introducing two types of local error bias that are treated as free parameters. Open- and closed-form solutions are given for finding these bias parameters. The new objective function integrates seamlessly into existing training algorithms such as back propagation (BP), output weight optimization (OWO), and hidden weight optimization (HWO). The resulting algorithms are successfully applied to training neural network classifiers having a linear final layer. Classifiers are trained and tested on several data sets from pattern recognition applications, and improvement over classical iterative regression methods is clearly demonstrated.
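The abstract's core idea, relaxing the mean-square error criterion by treating a local error bias as a free parameter with a closed-form solution, can be illustrated with a small sketch. This is an assumption-laden reconstruction, not the paper's exact algorithm: the one-sided bias rule, the random hidden layer, and the toy data are all illustrative choices; only the OWO-style least-squares output-weight step and the idea of a closed-form bias update come from the abstract.

```python
# Hedged sketch (not the paper's exact method): relaxed MSE for a classifier
# with a linear final layer. A bias b lets outputs overshoot the class target
# without penalty; b has a closed-form (clipped-residual) update, and the
# output weights are refit by least squares, an OWO-style step.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data passed through a fixed random hidden layer (assumption).
X = rng.normal(size=(200, 5))
labels = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
T = np.eye(2)[labels] * 2 - 1              # targets in {-1, +1}
H = np.tanh(X @ rng.normal(size=(5, 20)))  # hidden activations
H = np.hstack([H, np.ones((200, 1))])      # bias column for the linear layer

b = np.zeros_like(T)                       # relaxation bias, a free parameter
for _ in range(10):
    # OWO step: least-squares output weights for the relaxed targets T + b.
    W_o, *_ = np.linalg.lstsq(H, T + b, rcond=None)
    Y = H @ W_o
    # Closed-form bias update: absorb only residual on the "safe" side,
    # i.e. where the output overshoots the target in the target's direction.
    b = np.clip((Y - T) * np.sign(T), 0.0, None) * np.sign(T)

relaxed_mse = np.mean((Y - (T + b)) ** 2)
plain_mse = np.mean((Y - T) ** 2)
```

Because the bias absorbs only overshoot, each relaxed residual is no larger in magnitude than the plain residual, so the relaxed MSE never exceeds the classical MSE; this is the sense in which the relaxed objective is a strictly easier regression problem.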
