Review

A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures

Journal

NEURAL COMPUTATION
Volume 31, Issue 7, Pages 1235-1270

Publisher

MIT PRESS
DOI: 10.1162/neco_a_01199

Keywords

None listed

Funding

  1. National Natural Science Foundation of China [61773386, 61573365, 61573366, 61573076]
  2. Young Elite Scientists Sponsorship Program of China Association for Science and Technology [2016QNRC001, 2018YFB1306100]

Abstract

Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information in the input data when the time gap between relevant inputs is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) can handle the problem of long-term dependencies well. Since its introduction, almost all of the exciting results based on RNNs have been achieved with the LSTM, and it has become the focus of deep learning. We review the LSTM cell and its variants to explore the learning capacity of the LSTM cell. Furthermore, LSTM networks are divided into two broad categories: LSTM-dominated networks and integrated LSTM networks, and their various applications are discussed. Finally, future research directions for LSTM networks are presented.
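For readers unfamiliar with the gating mechanism the abstract refers to, the following is a minimal NumPy sketch of one forward step of the standard LSTM cell (forget, input, and output gates plus a candidate state). It follows the commonly used formulation, not any specific variant surveyed in the paper; the function name lstm_step, the fused weight layout, and the dimensions are illustrative choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    x      : input vector, shape (d,)
    h_prev : previous hidden state, shape (n,)
    c_prev : previous cell state, shape (n,)
    W, U   : input/recurrent weights, shapes (4n, d) and (4n, n)
    b      : bias, shape (4n,)
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # all four pre-activations at once
    f = sigmoid(z[0*n:1*n])         # forget gate: what to erase from c_prev
    i = sigmoid(z[1*n:2*n])         # input gate: what to write
    g = np.tanh(z[2*n:3*n])         # candidate cell state
    o = sigmoid(z[3*n:4*n])         # output gate: what to expose
    c = f * c_prev + i * g          # additive cell update (long-term memory)
    h = o * np.tanh(c)              # new hidden state
    return h, c

# Illustrative usage with random parameters (d = 3 inputs, n = 4 hidden units).
rng = np.random.default_rng(0)
d, n = 3, 4
W = rng.normal(size=(4*n, d))
U = rng.normal(size=(4*n, n))
b = np.zeros(4*n)
h, c = np.zeros(n), np.zeros(n)
for x in rng.normal(size=(5, d)):   # a short sequence of 5 inputs
    h, c = lstm_step(x, h, c, W, U, b)
```

The additive cell update c = f * c_prev + i * g is what lets the LSTM bridge long time gaps: when the forget gate stays near 1, information stored in the cell state can persist largely unchanged across many steps, which is the property the abstract contrasts with plain sigma or tanh cells.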
