Journal
NEUROCOMPUTING
Volume 57, Pages 87-104
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2004.01.006
Keywords
recurrent neural network; language; generalization; systematicity
Generalization performance in recurrent neural networks is enhanced by cascading several networks. Discretizing the abstractions induced in one network lets subsequent networks operate at a coarse symbolic level, improving performance on sparse and structural prediction tasks. The degree of systematicity exhibited by the cascade of recurrent networks is assessed on three language domains. (C) 2004 Elsevier B.V. All rights reserved.
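The cascade described in the abstract can be illustrated with a minimal sketch: a first simple recurrent (Elman-style) network induces continuous hidden-state abstractions over a symbol sequence, those states are discretized into coarse symbolic codes, and a second network then operates on the discrete codes. All weights, dimensions, and the thresholding scheme below are illustrative assumptions, not the authors' actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def elman_step(x, h, W_in, W_rec):
    """One step of a simple Elman recurrence: h' = tanh(W_in x + W_rec h)."""
    return np.tanh(W_in @ x + W_rec @ h)

def run_rnn(seq, W_in, W_rec, hidden):
    """Run the recurrence over a sequence of input vectors; return all hidden states."""
    h = np.zeros(hidden)
    states = []
    for x in seq:
        h = elman_step(x, h, W_in, W_rec)
        states.append(h)
    return np.array(states)

def discretize(states, n_levels=2):
    """Coarse symbolic codes: bin each hidden unit's activation into n_levels levels."""
    bins = np.linspace(-1, 1, n_levels + 1)[1:-1]
    return np.digitize(states, bins)

# Toy sequence of one-hot symbols over a 4-word vocabulary (hypothetical data).
vocab, hidden = 4, 8
seq = np.eye(vocab)[[0, 2, 1, 3, 0]]

# First network induces continuous abstractions over the sequence...
W_in1 = rng.normal(0, 0.5, (hidden, vocab))
W_rec1 = rng.normal(0, 0.5, (hidden, hidden))
abstractions = run_rnn(seq, W_in1, W_rec1, hidden)

# ...which are discretized into binary codes that a second network consumes.
codes = discretize(abstractions)          # shape (T, hidden), values in {0, 1}
W_in2 = rng.normal(0, 0.5, (hidden, hidden))
W_rec2 = rng.normal(0, 0.5, (hidden, hidden))
cascade_states = run_rnn(codes.astype(float), W_in2, W_rec2, hidden)

print(codes.shape, cascade_states.shape)  # (5, 8) (5, 8)
```

In this sketch the discretization step is what decouples the two stages: the second network never sees the first network's raw continuous dynamics, only a small alphabet of binary codes, which is the mechanism the abstract credits for improved generalization on sparse, structured prediction tasks.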
Authors