Journal
INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS
Volume 14, Issue 1-2, Pages 281-301
Publisher
WORLD SCIENTIFIC PUBL CO PTE LTD
DOI: 10.1142/S0218213005002107
Keywords
neural networks; classification; relaxation approach; MLP training
A method that modifies the objective function used for designing neural network classifiers is presented. The classical mean-square error criterion is relaxed by introducing two types of local error bias, which are treated as free parameters. Open- and closed-form solutions are given for finding these bias parameters. The new objective function integrates seamlessly into existing training algorithms such as backpropagation (BP), output weight optimization (OWO), and hidden weight optimization (HWO). The resulting algorithms are successfully applied to training neural network classifiers having a linear final layer. Classifiers are trained and tested on several data sets from pattern recognition applications. Improvement over classical iterative regression methods is clearly demonstrated.
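As an illustration of the idea, the sketch below shows one common way such a relaxation can be realized for a linear-output classifier: a per-output bias `d` is treated as a free parameter and solved in closed form so that correct-class outputs exceeding their target, and incorrect-class outputs falling below theirs, incur no penalty. The function name, the additive form of the bias, and the specific closed-form solution are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def relaxed_mse(y, t, correct_mask):
    """Relaxed mean-square error with closed-form local error bias.

    y            : network outputs, shape (n_outputs,)
    t            : target values, shape (n_outputs,)
    correct_mask : boolean array, True for the correct-class output

    The bias d is chosen to zero out residuals that point in the
    'harmless' direction: correct-class outputs above target and
    incorrect-class outputs below target are not penalized.
    (Assumed formulation for illustration.)
    """
    d = np.where(correct_mask,
                 np.maximum(0.0, y - t),   # correct class: overshoot is free
                 np.minimum(0.0, y - t))   # wrong class: undershoot is free
    loss = np.mean((t + d - y) ** 2)
    return loss, d

# Usage: overshooting the correct class and undershooting the wrong
# class yields zero relaxed loss, unlike the classical MSE.
loss_ok, _ = relaxed_mse(np.array([1.2, -0.1]),
                         np.array([1.0, 0.0]),
                         np.array([True, False]))
loss_bad, _ = relaxed_mse(np.array([0.8, 0.3]),
                          np.array([1.0, 0.0]),
                          np.array([True, False]))
```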