Journal
NEURAL NETWORKS
Volume 108, Pages 217-223
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2018.08.012
Keywords
Neural network; Resistive random-access memory (RRAM); On-chip learning; Multilayer perceptron (MLP); Neuromorphic computing
Funding
- Beijing Advanced Innovation Center for Future Chip (ICFC)
- National Key Research and Development Program of China [2017YFB0405604]
Abstract
Currently, powerful deep learning models usually require significant processor and memory resources, which leads to very high energy consumption. The emerging resistive random-access memory (RRAM) has shown great potential for constructing scalable and energy-efficient neural networks. However, it is hard to port a high-precision neural network from conventional digital CMOS hardware to analog RRAM systems owing to the variability of RRAM devices. A suitable on-chip learning algorithm should be developed to retrain the network or recover its performance. In addition, how to integrate the peripheral digital computation with the analog RRAM crossbar remains a challenge. Here, we propose an on-chip learning algorithm, named sign backpropagation (SBP), for an RRAM-based multilayer perceptron (MLP) with binary interfaces (0, 1) in the forward pass and 2-bit signals (+/- 1, 0) in the backward pass. Simulation results show that the proposed method and architecture achieve classification accuracy on the MNIST dataset comparable to a standard MLP, while saving area and energy in the calculation and storage of intermediate results and exploiting the potential of the RRAM crossbar for neuromorphic computing. (C) 2018 The Author(s). Published by Elsevier Ltd.
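The core idea of SBP described above, binary activations in the forward pass and sign-quantized (2-bit) error signals in the backward pass, can be sketched in NumPy. This is an illustrative assumption of how such a scheme operates, not the authors' actual implementation; the network size, thresholds, and learning rate are all hypothetical.

```python
import numpy as np

# Minimal sketch of sign backpropagation (SBP): forward interfaces are
# binarized to {0, 1}; backward error signals are quantized to {-1, 0, +1}.
# All names, sizes, and thresholds are illustrative assumptions.

rng = np.random.default_rng(0)

def binarize(x, thresh=0.0):
    """Forward interface: 1-bit activations in {0, 1}."""
    return (x > thresh).astype(np.float64)

def sign_quantize(g, eps=1e-3):
    """Backward interface: 2-bit error signals in {-1, 0, +1}."""
    q = np.sign(g)
    q[np.abs(g) < eps] = 0.0
    return q

# Tiny 2-layer MLP; each weight matrix would map onto one RRAM crossbar.
W1 = rng.normal(0.0, 0.5, (4, 8))
W2 = rng.normal(0.0, 0.5, (8, 2))

def train_step(x, y, lr=0.01):
    """One SBP update; only signed (+/-) weight changes are applied,
    which matches simple potentiation/depression pulses on RRAM cells."""
    global W1, W2
    # Forward: binary hidden activations (crossbar-friendly interface).
    h = binarize(x @ W1)
    out = h @ W2
    # Backward: quantize errors to signs before propagating.
    e_out = sign_quantize(out - y)
    e_h = sign_quantize((e_out @ W2.T) * h)  # only active units pass error
    W2 -= lr * np.outer(h, e_out)
    W1 -= lr * np.outer(x, e_h)
    return out

x = binarize(rng.normal(size=4))   # binary input vector
y = np.array([1.0, 0.0])           # one-hot target
out = train_step(x, y)
```

Because the updates depend only on the signs of the propagated errors, each weight change reduces to an increment or decrement of a cell's conductance, avoiding high-precision intermediate storage in the periphery.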