Article

On extended long short-term memory and dependent bidirectional recurrent neural network

Journal

NEUROCOMPUTING
Volume 356, Pages 151-161

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.04.044

Keywords

Recurrent neural networks; Long short-term memory; Gated recurrent unit; Bidirectional recurrent neural networks; Encoder-decoder; Natural language processing

Abstract

In this work, we first analyze the memory behavior in three recurrent neural network (RNN) cells, namely the simple RNN (SRN), the long short-term memory (LSTM), and the gated recurrent unit (GRU), where the memory is defined as a function that maps previous elements in a sequence to the current output. Our study shows that all three suffer from rapid memory decay. Then, to alleviate this effect, we introduce trainable scaling factors that act like an attention mechanism to adjust memory decay adaptively. The new design is called the extended LSTM (ELSTM). Finally, to design a system that is robust to previous erroneous predictions, we propose a dependent bidirectional recurrent neural network (DBRNN). Extensive experiments are conducted on different language tasks to demonstrate the superiority of the proposed ELSTM and DBRNN solutions. The ELSTM achieves up to a 30% increase in the labeled attachment score (LAS) as compared to LSTM and GRU in the dependency parsing (DP) task. Our models also outperform other state-of-the-art models such as bi-attention [1] and convolutional sequence to sequence (convseq2seq) [2] by close to 10% in the LAS.
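To make the core idea concrete, below is a minimal sketch of an ELSTM-style cell: a standard LSTM augmented with trainable, position-dependent scaling factors that rescale each input before it reaches the gates, acting like an attention weight that counteracts memory decay. This is an illustration only; the names (`ELSTMCell`, `max_len`, `s`) and the exact placement of the scaling factors are assumptions, not the paper's published equations.

```python
# Hypothetical ELSTM-style cell: an LSTM whose input at each position t is
# multiplied by a trainable scaling factor s[t]. The scales are learned and
# can amplify early inputs so their influence decays more slowly.
# (Assumed formulation; consult the paper for the exact design.)
import torch
import torch.nn as nn

class ELSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, max_len):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # One trainable scaling factor per sequence position (assumption).
        self.s = nn.Parameter(torch.ones(max_len))

    def forward(self, x_seq):
        # x_seq: (seq_len, batch, input_size)
        seq_len, batch, _ = x_seq.shape
        h = x_seq.new_zeros(batch, self.cell.hidden_size)
        c = x_seq.new_zeros(batch, self.cell.hidden_size)
        outputs = []
        for t in range(seq_len):
            # Rescale the input at position t before the gates see it.
            h, c = self.cell(self.s[t] * x_seq[t], (h, c))
            outputs.append(h)
        return torch.stack(outputs)  # (seq_len, batch, hidden_size)

# Toy forward pass.
cell = ELSTMCell(input_size=8, hidden_size=16, max_len=20)
y = cell(torch.randn(20, 4, 8))
print(y.shape)  # torch.Size([20, 4, 16])
```

Since the scaling factors are ordinary parameters, they are learned jointly with the gate weights by backpropagation, which is what lets the model adjust memory decay adaptively rather than relying on the fixed gating dynamics of a plain LSTM or GRU.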
