Journal
NEURAL COMPUTATION
Volume 20, Issue 10, Pages 2564-2596
Publisher
MIT PRESS
DOI: 10.1162/neco.2008.05-07-527
Funding
- Royal Society International Joint Project
- RFBR [08-08-00103-a]
Recurrent neural networks with fixed weights have been shown in practice to adaptively classify signals that vary as a function of time, in the presence of additive noise and parametric perturbations. We address the question: can this ability be explained theoretically? We provide a mathematical proof that these networks have this ability even when parametric perturbations enter the signals nonlinearly. The only restrictions we impose on the signals to be classified are that they satisfy a nondegeneracy assumption and that the noise amplitude is sufficiently small. Further, we demonstrate that recurrent neural networks can not only classify uncertain signals adaptively but also recover the values of the signals' uncertain parameters, up to their equivalence classes.