Article; Proceedings Paper

Syntactic systematicity in sentence processing with a recurrent self-organizing network

Journal

NEUROCOMPUTING
Volume 71, Issue 7-9, Pages 1172-1179

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2007.11.025

Keywords

recurrent neural network; self-organization; next-word prediction; systematicity

Abstract

As potential candidates for explaining human cognition, connectionist models of sentence processing must demonstrate their ability to behave systematically, generalizing from a small training set. It has recently been shown that simple recurrent networks and, to a greater extent, echo-state networks possess some ability to generalize in artificial language learning tasks. We investigate this capacity for a recently introduced model that consists of separately trained modules: a recursive self-organizing module for learning temporal context representations and a feedforward two-layer perceptron module for next-word prediction. We show that the performance of this architecture is comparable with echo-state networks. Taken together, these results weaken the criticism of connectionist approaches, showing that various general recursive connectionist architectures share the potential of behaving systematically. (c) 2008 Elsevier B.V. All rights reserved.
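The two-stage architecture described above can be sketched in code: a recursive self-organizing map (RecSOM-style) whose activation depends on both the current word and the map's own previous activation, followed by a separately trained two-layer perceptron that reads the frozen map activations and predicts the next word. This is a minimal, hypothetical reconstruction, assuming one common RecSOM formulation; the toy vocabulary, map size, learning rates, and neighbourhood function are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy corpus standing in for the artificial language used in
# such experiments; the real training set is not given in the abstract.
vocab = ["john", "mary", "sees", "runs", "."]
one_hot = np.eye(len(vocab))
word2id = {w: i for i, w in enumerate(vocab)}
sentences = [["john", "sees", "mary", "."], ["mary", "runs", "."]]

class RecSOM:
    """Recursive self-organizing map: each unit stores an input weight and
    a context weight, so its activation reflects both the current word and
    the map's previous activation (a temporal context representation)."""
    def __init__(self, n_units, dim, alpha=0.5, beta=0.5):
        self.w = rng.normal(scale=0.1, size=(n_units, dim))      # input weights
        self.c = rng.normal(scale=0.1, size=(n_units, n_units))  # context weights
        self.alpha, self.beta = alpha, beta
        self.y = np.zeros(n_units)                               # previous activation

    def reset(self):
        self.y = np.zeros_like(self.y)

    def step(self, x, lr=0.1, train=True):
        d = (self.alpha * ((x - self.w) ** 2).sum(axis=1)
             + self.beta * ((self.y - self.c) ** 2).sum(axis=1))
        if train:
            winner = int(np.argmin(d))
            # Simplified 1-D Gaussian neighbourhood around the winning unit.
            h = np.exp(-((np.arange(len(d)) - winner) ** 2) / 4.0)
            self.w += lr * h[:, None] * (x - self.w)
            self.c += lr * h[:, None] * (self.y - self.c)
        self.y = np.exp(-d)
        return self.y

class TwoLayerPerceptron:
    """Feedforward readout trained separately, on frozen map activations,
    for next-word prediction (softmax output, cross-entropy loss)."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

    def forward(self, x):
        self.h = np.tanh(x @ self.W1)
        z = self.h @ self.W2
        e = np.exp(z - z.max())
        return e / e.sum()

    def train_step(self, x, target, lr=0.1):
        p = self.forward(x)
        dz = p - target                          # softmax/cross-entropy gradient
        dh = (self.W2 @ dz) * (1.0 - self.h ** 2)
        self.W2 -= lr * np.outer(self.h, dz)
        self.W1 -= lr * np.outer(x, dh)
        return -np.log(p[np.argmax(target)] + 1e-12)

# Stage 1: self-organize temporal context representations on the corpus.
som = RecSOM(n_units=30, dim=len(vocab))
for _ in range(20):
    for s in sentences:
        som.reset()
        for w in s:
            som.step(one_hot[word2id[w]])

# Stage 2: freeze the map and train the perceptron on (context, next word) pairs.
pairs = []
for s in sentences:
    som.reset()
    for w, nxt in zip(s, s[1:]):
        y = som.step(one_hot[word2id[w]], train=False)
        pairs.append((y, one_hot[word2id[nxt]]))

mlp = TwoLayerPerceptron(n_in=30, n_hidden=20, n_out=len(vocab))
for _ in range(300):
    loss = sum(mlp.train_step(x, t) for x, t in pairs)
```

The separate training of the two modules is the point of contrast with simple recurrent networks and echo-state networks: the map's context representations are learned without any next-word supervision, and only the perceptron is trained on the prediction task.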
