A survey on the application of recurrent neural networks to statistical language modeling

Journal

Computer Speech and Language
Volume 30, Issue 1, Pages 61-98

Publisher

Academic Press Ltd - Elsevier Science Ltd
DOI: 10.1016/j.csl.2014.09.005

Keywords

Recurrent neural networks; Natural language processing; Language modeling; Speech recognition; Machine translation

Funding

  1. EU (FET-open call) project MUSE (Machine Understanding for interactive StorytElling) [FP7-296703]

Abstract

In this paper, we present a survey on the application of recurrent neural networks to the task of statistical language modeling. Although it has been shown that these models obtain good performance on this task, often superior to other state-of-the-art techniques, they suffer from some important drawbacks, including a very long training time and limitations on the number of context words that can be taken into account in practice. Recent extensions to recurrent neural network models have been developed in an attempt to address these drawbacks. This paper gives an overview of the most important extensions. Each technique is described and its performance on statistical language modeling, as described in the existing literature, is discussed. Our structured overview makes it possible to detect the most promising techniques in the field of recurrent neural networks, applied to language modeling, but it also highlights the techniques for which further research is required. © 2014 The Authors. Published by Elsevier Ltd.
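
To make the class of model surveyed here concrete, the sketch below shows a minimal Elman-style recurrent language model in plain NumPy: a hidden state is updated one word at a time and then projected to a softmax distribution over the vocabulary. This is an illustrative assumption, not the authors' implementation; all dimensions and names are toy values chosen for the example.

```python
# Minimal sketch of a recurrent neural network language model (illustrative
# only; not the implementation described in the surveyed papers).
import numpy as np

rng = np.random.default_rng(0)

vocab_size, embed_dim, hidden_dim = 1000, 32, 64  # assumed toy sizes

# Parameters: word embeddings, recurrent weights, output projection.
E = rng.normal(0, 0.1, (vocab_size, embed_dim))
W_xh = rng.normal(0, 0.1, (embed_dim, hidden_dim))
W_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
W_hy = rng.normal(0, 0.1, (hidden_dim, vocab_size))
b_h = np.zeros(hidden_dim)
b_y = np.zeros(vocab_size)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def next_word_distribution(word_ids):
    """Run the recurrence over a word-id history and return P(next word)."""
    h = np.zeros(hidden_dim)
    for w in word_ids:
        # The hidden state summarizes the entire history in principle, but
        # in practice the influence of distant words fades, which relates to
        # the limited-context drawback discussed in the survey.
        h = np.tanh(E[w] @ W_xh + h @ W_hh + b_h)
    return softmax(h @ W_hy + b_y)

# Example: distribution over the vocabulary after a three-word history.
probs = next_word_distribution([12, 7, 104])
print(probs.shape, probs.sum())  # (1000,) 1.0
```

The per-word recurrence and the full-vocabulary softmax in this sketch also illustrate why training such models is slow, one of the drawbacks the surveyed extensions try to address.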
