Journal
INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS
Volume 14, Issue 1-2, Pages 281-301
Publisher
WORLD SCIENTIFIC PUBL CO PTE LTD
DOI: 10.1142/S0218213005002107
Keywords
neural networks; classification; relaxation approach; MLP training
Abstract
A method that modifies the objective function used for designing neural network classifiers is presented. The classical mean-square error criterion is relaxed by introducing two types of local error bias, which are treated as free parameters. Open- and closed-form solutions are given for finding these bias parameters. The new objective function is seamlessly integrated into existing training algorithms such as back propagation (BP), output weight optimization (OWO), and hidden weight optimization (HWO). The resulting algorithms are successfully applied to training neural net classifiers having a linear final layer. Classifiers are trained and tested on several data sets from pattern recognition applications. Improvement over classical iterative regression methods is clearly demonstrated.
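The core idea above can be illustrated with a minimal sketch. The exact form of the paper's local error biases is not given in the abstract, so the following assumes one common relaxation for classifiers with a linear output layer: the target for the true class may rise when the output overshoots it, and targets for other classes may fall when the output undershoots, so "too correct" outputs are not penalized. The bias then has a closed-form solution per output, and the linear final layer can be re-solved by least squares in an OWO-like loop. All names here (`relaxed_targets`, the 1/0 target coding, the iteration count) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def relaxed_targets(Y, T, labels):
    """Closed-form local error bias (assumed form, not the paper's exact one).

    Y      : (N, C) network outputs
    T      : (N, C) nominal targets (1 for true class, 0 otherwise)
    labels : (N,)   true class indices
    Returns biased targets T + B whose squared error never exceeds
    the classical error (Y - T)^2, element-wise.
    """
    N, C = Y.shape
    true_mask = np.zeros_like(Y, dtype=bool)
    true_mask[np.arange(N), labels] = True
    diff = Y - T
    B = np.zeros_like(Y)
    # True class: raise the target when the output overshoots it.
    B[true_mask] = np.maximum(diff[true_mask], 0.0)
    # Other classes: lower the target when the output undershoots it.
    B[~true_mask] = np.minimum(diff[~true_mask], 0.0)
    return T + B

# Toy demo: linear classifier on random data, alternating between
# solving the output weights (least squares) and updating the biases.
rng = np.random.default_rng(0)
N, D, C = 200, 5, 3
X = rng.normal(size=(N, D))
labels = rng.integers(0, C, size=N)
T = np.eye(C)[labels]                      # 1-of-C target coding
Xb = np.hstack([X, np.ones((N, 1))])       # append bias input
W = np.linalg.lstsq(Xb, T, rcond=None)[0]  # classical least-squares fit
for _ in range(5):
    Y = Xb @ W
    Tr = relaxed_targets(Y, T, labels)     # closed-form bias update
    W = np.linalg.lstsq(Xb, Tr, rcond=None)[0]
```

By construction the relaxed squared error is never larger than the classical one for the same outputs, which is the sense in which the criterion is "relaxed"; the same target substitution could feed a BP or HWO update instead of the least-squares step shown here.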