Article

LSTM recurrent networks learn simple context-free and context-sensitive languages

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 12, Issue 6, Pages 1333-1340

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/72.963769

Keywords

context-free languages (CFLs); context-sensitive languages (CSLs); long short-term memory (LSTM); recurrent neural networks (RNNs)

Abstract

Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). Here we demonstrate LSTM's superior performance on context-free language (CFL) benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a simple context-sensitive language (CSL), namely a^n b^n c^n.
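To make the benchmark concrete, the following is a minimal Python sketch of the a^n b^n c^n task in the sequence-prediction formulation typical of such experiments. It generates exemplar strings and, for each prefix, the set of symbols that may legally come next, which is the target a prediction network such as LSTM would be trained toward. The start/end markers 'S' and 'T', the assumption n >= 1, and the set-valued targets are illustrative assumptions here, not necessarily the paper's exact experimental setup.

def make_string(n):
    """One exemplar of the context-sensitive language a^n b^n c^n,
    wrapped in (assumed) start/end markers."""
    return "S" + "a" * n + "b" * n + "c" * n + "T"

def next_symbol_targets(s):
    """For each prefix of s, the set of symbols that may legally follow.
    While reading the 'a' block, n is still unknown, so either another
    'a' or the first 'b' is possible; once the first 'b' appears, the
    remainder of the string is uniquely determined."""
    targets = []
    for i in range(len(s) - 1):
        if s[i] == "a":
            targets.append({"a", "b"})   # length n not yet revealed
        else:
            targets.append({s[i + 1]})   # continuation is deterministic
    return targets

if __name__ == "__main__":
    s = make_string(3)                   # "SaaabbbcccT"
    for k, tgt in enumerate(next_symbol_targets(s), start=1):
        print(f"{s[:k]:>11} -> {sorted(tgt)}")

A network succeeds on a string if, at every step, it activates exactly the output units in the target set; generalization is then assessed on values of n larger than those seen during training.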

