Article

An Improved RRAM-Based Binarized Neural Network With High Variation-Tolerated Forward/Backward Propagation Module

Journal

IEEE TRANSACTIONS ON ELECTRON DEVICES
Volume 67, Issue 2, Pages 469-473

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TED.2019.2956967

Keywords

Binarized neural networks (BNNs); resistive switching random access memory (RRAM); vector-matrix multiplication (VMM)

Funding

  1. National Natural Science Foundation of China [61421005, 61604005, 61334007, 61834001]

Binarized neural networks (BNNs) enable resistive switching random access memory (RRAM) with high nonlinearity and asymmetry to realize online training, using an RRAM comparator structure. In this work, a new hardware implementation approach is proposed to improve the efficiency of BNNs. In this approach, a 1T1R array-based propagation module is introduced and designed to accelerate fully parallel vector-matrix multiplication (VMM) in both forward and backward propagation. Using the 1T1R-based propagation module, high computing efficiency is achieved in both training and inference tasks, with improvements of 50× and 177×, respectively. To address the computation error caused by device variation, a novel operation scheme with low gate voltage is proposed. With this scheme, RRAM variation is dramatically suppressed, by 74.8% cycle-to-cycle and 59.9% device-to-device. This enables high-accuracy VMM calculation and therefore achieves 94.7% accuracy with a typical BNN, only 0.7% below the ideal variation-free case.
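The abstract's core operation, a binarized VMM whose outputs are corrupted by RRAM conductance variation, can be illustrated with a minimal sketch. This is not the paper's implementation: the array size, the multiplicative-Gaussian variation model, and the variation level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarized_vmm(x, w, variation=0.0):
    """Compute sign(x @ w) with multiplicative conductance noise.

    x: +/-1 input vector; w: +/-1 weight matrix (ideal conductances).
    variation: relative std-dev of per-device conductance error,
    loosely modeling cycle-to-cycle / device-to-device nonideality.
    """
    noisy_w = w * (1.0 + variation * rng.standard_normal(w.shape))
    return np.sign(x @ noisy_w)

# +/-1 inputs and weights, as in a BNN layer; an odd input length
# keeps the accumulated sum nonzero so sign() is always +/-1.
x = rng.choice([-1.0, 1.0], size=127)
w = rng.choice([-1.0, 1.0], size=(127, 64))

ideal = binarized_vmm(x, w, variation=0.0)
noisy = binarized_vmm(x, w, variation=0.3)

# Fraction of output signs flipped by device variation
flip_rate = np.mean(ideal != noisy)
print(f"output sign flips under 30% relative variation: {flip_rate:.2%}")
```

Because the output is taken through a sign comparison, only outputs whose accumulated current is near zero flip under variation, which is why suppressing variation (as the paper's low-gate-voltage scheme does) directly improves VMM accuracy.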
