Journal
IFAC PAPERSONLINE
Volume 53, Issue 2, Pages 1243-1248
Publisher
ELSEVIER
DOI: 10.1016/j.ifacol.2020.12.1342
Keywords
Nonlinear system identification; Recurrent Neural Networks; Gated Recurrent Units
Recurrent Neural Networks are applied in areas such as speech recognition, natural language and video processing, and the identification of nonlinear state space models. Conventional Recurrent Neural Networks, e.g. the Elman Network, are hard to train. A more recently developed class of recurrent neural networks, so-called Gated Units, outperforms its conventional counterparts on virtually every task. This paper aims to provide additional insights into the differences between RNNs and Gated Units in order to explain the superior performance of gated recurrent units. It is argued that Gated Units are easier to optimize not because they solve the vanishing gradient problem, but because they circumvent the emergence of large local gradients. Copyright (C) 2020 The Authors.
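The abstract's central claim can be illustrated with a small numerical sketch (my own, not from the paper): in a scalar Elman-style recurrence, the local Jacobian d h_t/d h_{t-1} = w·(1 − h_t²) can exceed 1 when the recurrent weight w is large, so the backpropagated gradient may blow up; in a simplified gated recurrence where the update gate depends only on the input, the local Jacobian is (1 − z_t) ∈ (0, 1), so the gradient can shrink but never spikes. The weights, sequence length, and gate simplification below are hypothetical assumptions for illustration.

```python
# Hypothetical sketch comparing backpropagated gradients d h_T / d h_0
# through a scalar Elman recurrence and a simplified gated recurrence.
# All weights and the gate simplification are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
T = 100   # sequence length (assumed)
w = 2.5   # recurrent weight > 1, so local Jacobians can exceed 1

x = rng.standard_normal(T)

# Elman: h_t = tanh(w*h_{t-1} + x_t), local Jacobian w*(1 - h_t^2)
h, grad_elman = 0.0, 1.0
for t in range(T):
    h = np.tanh(w * h + x[t])
    grad_elman *= w * (1.0 - h ** 2)   # may be larger or smaller than 1

# Simplified gated unit: z_t = sigmoid(x_t) depends only on the input,
# h_t = (1 - z_t)*h_{t-1} + z_t*tanh(x_t), local Jacobian (1 - z_t)
h, grad_gated = 0.0, 1.0
for t in range(T):
    z = 1.0 / (1.0 + np.exp(-x[t]))
    h = (1.0 - z) * h + z * np.tanh(x[t])
    grad_gated *= (1.0 - z)            # always strictly between 0 and 1

print(f"Elman |grad| = {abs(grad_elman):.3e}")
print(f"Gated |grad| = {abs(grad_gated):.3e}")
```

Because every gated factor lies in (0, 1), the gated gradient is bounded above by 1 regardless of sequence length, whereas the Elman gradient has no such bound; this mirrors the paper's argument that gating avoids large local gradients rather than curing vanishing ones.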