Journal
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS
Volume 66, Issue 11, Pages 1885-1889
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSII.2019.2924663
Keywords
Logic gates; Hardware; Computer architecture; Simulation; Recurrent neural networks; Training; Microprocessors; LSTM; neural network; deep learning; image recognition; hardware recurrent neural networks; ELSTM
Recurrent Neural Networks (RNNs) have become a popular method for learning from sequential data. Due to their recurrent nature, RNN computations are difficult to fully parallelize on conventional hardware. A further challenge is finding an optimal RNN structure, given the complexity of computing its hidden units. This brief presents a new variant of the Long Short-Term Memory (LSTM) network that aims to reduce the cost of the computation unit. The proposed Economic LSTM (ELSTM) is designed to perform its functionality with few hardware units. Because ELSTM requires fewer units than existing LSTM versions, it is attractive in terms of both processing speed and hardware design cost. The proposed approach is evaluated on three datasets and compared with other methods; the simulation results show that it achieves comparable accuracy. At the hardware level, the proposed method is implemented on an Altera FPGA.
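The abstract does not give ELSTM's exact formulation, but the baseline it simplifies is the standard LSTM cell, whose gate equations are well established. The sketch below shows one step of a conventional LSTM in numpy; the function name `lstm_step` and the weight layout are illustrative choices, not the paper's implementation. Each gate requires its own matrix-vector products and nonlinearities, which is the hardware cost ELSTM targets.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell.

    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) biases,
    stacked in order [input gate, forget gate, cell candidate, output gate].
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # all four pre-activations at once
    i = sigmoid(z[0:H])             # input gate
    f = sigmoid(z[H:2*H])           # forget gate
    g = np.tanh(z[2*H:3*H])         # candidate cell state
    o = sigmoid(z[3*H:4*H])         # output gate
    c = f * c_prev + i * g          # new cell state
    h = o * np.tanh(c)              # new hidden state
    return h, c

# Tiny usage example with random weights (D inputs, H hidden units)
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), W, U, b)
print(h.shape, c.shape)
```

Counting the operations above makes the motivation concrete: a standard cell needs eight matrix products and five pointwise nonlinearities per step, so removing or sharing gates directly reduces multiplier and activation-unit count in an FPGA datapath.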