Article

Minimal Gated Unit for Recurrent Neural Networks

Journal

Publisher

Springer Nature
DOI: 10.1007/s11633-016-1006-2

Keywords

Recurrent neural network; minimal gated unit (MGU); gated unit; gated recurrent unit (GRU); long short-term memory (LSTM); deep learning

Funding

  1. National Natural Science Foundation of China [61422203, 61333014]
  2. National Key Basic Research Program of China [2014CB340501]

Abstract

Recurrent neural networks (RNNs) have been very successful in handling sequence data. However, understanding RNNs and finding the best practices for RNN learning are difficult tasks, partly because there are many competing and complex hidden units, such as the long short-term memory (LSTM) unit and the gated recurrent unit (GRU). We propose a gated unit for RNNs, named the minimal gated unit (MGU), which contains only one gate and is therefore a minimal design among all gated hidden units. The design of MGU benefits from published evaluation results on LSTM and GRU. Experiments on various sequence data show that MGU achieves accuracy comparable to GRU, but with a simpler structure, fewer parameters, and faster training. Hence, MGU is well suited for RNN applications. Its simple architecture also makes it easier to evaluate and tune, and, in principle, easier to study theoretically and empirically.
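The abstract does not spell out MGU's update equations, so the following is only a rough NumPy sketch of a one-gate recurrent cell consistent with the description above: a single forget-style gate both modulates the previous state inside the candidate activation and interpolates between the old and new states, covering the roles that GRU splits between its reset and update gates. The class and parameter names (MinimalGatedCell, W_f, U_f, and so on) are illustrative assumptions, not identifiers from the paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MinimalGatedCell:
    """Illustrative one-gate recurrent cell in the spirit of MGU (not the paper's code)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        # Parameters of the single (forget-style) gate -- the only gate in the unit.
        self.W_f = rng.uniform(-scale, scale, (hidden_size, input_size))
        self.U_f = rng.uniform(-scale, scale, (hidden_size, hidden_size))
        self.b_f = np.zeros(hidden_size)
        # Parameters of the candidate hidden state.
        self.W_h = rng.uniform(-scale, scale, (hidden_size, input_size))
        self.U_h = rng.uniform(-scale, scale, (hidden_size, hidden_size))
        self.b_h = np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        # Gate: how much of the previous state to expose and to overwrite.
        f_t = sigmoid(self.W_f @ x_t + self.U_f @ h_prev + self.b_f)
        # Candidate state, computed from the gated previous state.
        h_tilde = np.tanh(self.W_h @ x_t + self.U_h @ (f_t * h_prev) + self.b_h)
        # New state: interpolate between the old state and the candidate.
        return (1.0 - f_t) * h_prev + f_t * h_tilde

    def forward(self, xs):
        # Run the cell over a sequence of input vectors; return all hidden states.
        h = np.zeros(self.b_h.shape[0])
        states = []
        for x_t in xs:
            h = self.step(x_t, h)
            states.append(h)
        return np.stack(states)

if __name__ == "__main__":
    # Toy usage: a length-5 sequence of 8-dimensional inputs, 16 hidden units.
    cell = MinimalGatedCell(input_size=8, hidden_size=16)
    seq = np.random.default_rng(1).normal(size=(5, 8))
    print(cell.forward(seq).shape)  # (5, 16)

In this sketch the cell carries two weight groups per hidden unit instead of GRU's three, which is where the reduction in parameters and training time reported in the abstract would come from.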
