4.6 Article

Economic LSTM Approach for Recurrent Neural Networks

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/TCSII.2019.2924663

Keywords

Logic gates; Hardware; Computer architecture; Simulation; Recurrent neural networks; Training; Microprocessors; LSTM; neural network; deep learning; image recognition; hardware recurrent neural networks; ELSTM

Abstract

Recurrent Neural Networks (RNNs) have become a popular method for learning from sequential data. Because of their recurrent nature, it is often difficult to parallelize all RNN computations on conventional hardware. A further challenge is finding an optimal RNN structure, since the hidden units involve complex computations. This brief presents a new Long Short-Term Memory (LSTM) approach that aims to reduce the cost of the computation unit. The proposed Economic LSTM (ELSTM) is designed with only a few hardware units to perform its functionality. ELSTM uses fewer units than existing LSTM variants, which makes it attractive in terms of processing speed and hardware design cost. The proposed approach is tested on three datasets and compared with other methods. Simulation results show that the proposed method achieves accuracy comparable to the other methods. At the hardware level, the proposed method is implemented on an Altera FPGA.
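The abstract does not reproduce the ELSTM equations. For reference, the sketch below shows one step of a standard LSTM cell in NumPy, i.e. the baseline whose gate units the ELSTM is said to reduce; the exact reduced formulation is given in the paper and is not reproduced here. All names, shapes, and parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_step(x_t, h_prev, c_prev, params):
    # Standard LSTM cell: four gate/unit computations per time step.
    # params holds weight matrices W_* of shape (hidden, input + hidden)
    # and bias vectors b_* of shape (hidden,).
    z = np.concatenate([x_t, h_prev])                 # combined input [x_t; h_{t-1}]
    f = sigmoid(params["W_f"] @ z + params["b_f"])    # forget gate
    i = sigmoid(params["W_i"] @ z + params["b_i"])    # input gate
    g = np.tanh(params["W_c"] @ z + params["b_c"])    # candidate cell state
    o = sigmoid(params["W_o"] @ z + params["b_o"])    # output gate
    c_t = f * c_prev + i * g                          # new cell state
    h_t = o * np.tanh(c_t)                            # new hidden state
    return h_t, c_t

# Hypothetical usage: input size 8, hidden size 16.
rng = np.random.default_rng(0)
params = {k: rng.standard_normal((16, 24)) for k in ("W_f", "W_i", "W_c", "W_o")}
params.update({k: np.zeros(16) for k in ("b_f", "b_i", "b_c", "b_o")})
h, c = lstm_cell_step(rng.standard_normal(8), np.zeros(16), np.zeros(16), params)

Counting the gate units above makes the hardware motivation concrete: each time step requires four matrix-vector products plus several element-wise operations, which is what the ELSTM design seeks to trim.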
