Journal
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS
Volume 70, Issue 4, Pages 1366-1370
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCSII.2022.3224470
Keywords
Training; Neural networks; Hardware acceleration; Costs; Standards; Programming; Probabilistic logic; Resistive random access memory (RRAM); neuromorphic accelerator; write-verify; Bayesian method
Resistive random access memory (RRAM)-based neuromorphic hardware accelerators are attractive platforms for neural network acceleration due to their high energy efficiency. However, the inherent variations of RRAM, arising from the diffusion or recombination of oxygen vacancies, can cause significant conductance deviation from the target value, resulting in noticeable performance degradation. In practical ex situ training, write-verify methods are widely adopted to avoid this issue when transferring a trained network model. However, the intensive reading and reprogramming operations make conventional write-verify methods require extensive programming time and energy. In this brief, for the first time, we propose a novel write-verify scheme that transfers each weight with a different acceptable error margin, achieving a high-speed and high-efficiency write-verify process while maintaining network performance. Our experimental results show that the speed and energy efficiency of the write-verify process can be improved significantly, by up to 3.4x-9.0x and 4.1x-14.1x, respectively.
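The core idea of the abstract, iteratively programming each RRAM cell until its conductance falls within a per-weight error margin, can be illustrated with a minimal sketch. The function name, the single-pulse programming model, and the noise parameters below are illustrative assumptions, not the paper's actual device model or algorithm:

```python
import random

def write_verify(targets, margins, max_pulses=100, noise_std=0.01):
    """Hedged sketch of a per-weight write-verify loop.

    Each cell is reprogrammed (pulse) and re-read (verify) until its
    conductance lies within that weight's own acceptable margin.
    The programming step model (move halfway toward the target plus
    Gaussian device noise) is a hypothetical simplification.
    """
    conductances = []
    total_pulses = 0
    for g_target, margin in zip(targets, margins):
        g = 0.0  # assume cells start from a reset (low) state
        for _ in range(max_pulses):
            if abs(g - g_target) <= margin:  # verify step: within margin?
                break
            # program step: partial SET pulse with stochastic deviation
            g += 0.5 * (g_target - g) + random.gauss(0.0, noise_std)
            total_pulses += 1
        conductances.append(g)
    return conductances, total_pulses
```

A looser margin terminates the verify loop earlier, which is the mechanism behind the reported speed and energy savings: weights that the network can tolerate being imprecise receive fewer read/reprogram cycles.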