Article; Proceedings Paper

Adaptive learning algorithms to incorporate additional functional constraints into neural networks

Journal

NEUROCOMPUTING
Volume 35, Pages 73-90

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/S0925-2312(00)00296-4

Keywords

adaptive learning algorithm; mapping sensitivity; curvature smoothing; time-series prediction

Abstract

In this paper, adaptive learning algorithms are proposed to obtain better generalization performance. We specifically design cost terms for the additional functionality based on the first- and second-order derivatives of the neural activations at hidden layers. In the course of training, these additional cost functions penalize the input-to-output mapping sensitivity and the high-frequency components in the training data. A gradient-descent method results in hybrid learning rules that combine error back-propagation, Hebbian rules, and simple weight decay. However, the additional computational requirements over the standard error back-propagation algorithm are almost negligible. Theoretical justifications and simulation results are given to verify the effectiveness of the proposed learning algorithms. (C) 2000 Elsevier Science B.V. All rights reserved.
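
For concreteness, the following is a minimal PyTorch sketch of this style of derivative-based regularization. It is an illustrative approximation, not the paper's formulation: the model name SmoothMLP, the penalty weights lam_s and lam_c, and the particular forms of the sensitivity and curvature terms are assumptions chosen to mirror the abstract (closed-form first and second derivatives of tanh hidden activations), whereas the paper derives its own cost terms and the resulting hybrid update rules.

    # Illustrative sketch only; SmoothMLP, lam_s, and lam_c are hypothetical
    # names, and the penalties approximate (not reproduce) the paper's costs.
    import torch
    import torch.nn as nn

    class SmoothMLP(nn.Module):
        # One-hidden-layer tanh network (an assumed architecture).
        def __init__(self, n_in, n_hidden, n_out):
            super().__init__()
            self.hidden = nn.Linear(n_in, n_hidden)
            self.out = nn.Linear(n_hidden, n_out)

        def forward(self, x):
            h = torch.tanh(self.hidden(x))   # hidden activations
            return self.out(h), h

    def regularized_loss(model, x, y, lam_s=1e-3, lam_c=1e-3):
        y_hat, h = model(x)
        mse = ((y_hat - y) ** 2).mean()

        # For tanh, both derivatives follow from the activations themselves:
        # f'(net) = 1 - h^2 and f''(net) = -2 h (1 - h^2).
        d1 = 1.0 - h ** 2
        d2 = -2.0 * h * d1

        # Sensitivity-style penalty: a hidden unit contributes strongly to
        # the input-to-output Jacobian when its slope f'(net) and its
        # incoming weight norm are both large.
        w_in_norm2 = (model.hidden.weight ** 2).sum(dim=1)  # per hidden unit
        sensitivity = (d1 ** 2 * w_in_norm2).mean()

        # Curvature-smoothing penalty: large |f''(net)| marks sharp bends in
        # a unit's response, i.e. capacity to fit high-frequency components.
        curvature = (d2 ** 2).mean()

        return mse + lam_s * sensitivity + lam_c * curvature

    # Toy usage on a noisy 1-D regression task.
    model = SmoothMLP(1, 16, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    x = torch.linspace(-3.0, 3.0, 64).unsqueeze(1)
    y = torch.sin(x) + 0.1 * torch.randn_like(x)
    for _ in range(2000):
        opt.zero_grad()
        loss = regularized_loss(model, x, y)
        loss.backward()
        opt.step()

Because the tanh derivatives come in closed form from quantities the forward pass already produces, the extra cost per update is a few elementwise operations, consistent with the abstract's claim that the overhead over standard back-propagation is almost negligible.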
