Journal
IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS
Volume 42, Issue 5, Pages 1490-1503
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCAD.2022.3205872
Keywords
Computation-in-memory; magnetic tunnel junction (MTJ); neuromorphic computing; nonvolatile memory (NVM); retention faults
In this work, an approximate scrubbing technique is proposed for NVM-based neuromorphic fabric to mitigate unidirectional retention faults and improve the inference accuracy of MLPs and CNNs with negligible storage overhead.
Neuromorphic computation-in-memory fabric based on emerging nonvolatile memories is considered an attractive option to accelerate neural networks (NNs) in hardware, as it provides high performance, low power consumption, and reduced data movement. Although nonvolatile resistive memories (NVMs) offer many benefits, they are susceptible to data retention faults, where previously stored data are not retained after a certain amount of time due to external influence. These faults are more likely to occur unidirectionally and severely impact the inference accuracy of hardware implementations of NNs, since the synaptic weights stored in the NVMs are subject to retention faults. In this work, we propose an approximate scrubbing technique for NVM-based neuromorphic fabric to mitigate unidirectional retention faults with virtually zero storage overhead, depending on the definition of the scrub area, for multilayer perceptrons (MLPs) and convolutional NNs (CNNs). The training of the NNs is adjusted accordingly to meet the requirements of the proposed approximate scrubbing scheme. On different benchmarks, the proposed scrubbing approach can improve the inference accuracy by up to 85.51% for MLPs and 87.76% for CNNs over the expected device operational time with negligible storage overhead.
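The key idea in the abstract, that unidirectional faults can be repaired by rewriting a known value rather than a stored golden copy, can be illustrated with a toy model. The sketch below is not the authors' method: the fault rate, the all-ones scrub-area convention, and the function names are illustrative assumptions, chosen only to show why unidirectional decay plus training-time constraints make near-zero-overhead scrubbing possible.

```python
import random

def inject_unidirectional_faults(bits, fault_rate, rng):
    # Toy unidirectional retention-fault model (assumption): a stored '1'
    # may decay to '0' over time, while '0' is taken to be stable.
    return [0 if b == 1 and rng.random() < fault_rate else b for b in bits]

def approximate_scrub(bits, scrub_area):
    # Illustrative approximate scrubbing: every cell in the scrub area is
    # rewritten to '1'. Because faults only flip 1 -> 0, blind rewriting
    # can only repair cells; if training constrains the scrub area to be
    # all ones, no reference copy is needed (hence ~zero storage overhead).
    return [1 if i in scrub_area else b for i, b in enumerate(bits)]

rng = random.Random(0)
trained = [1] * 8 + [0, 1, 0, 1]   # hypothetical weights; first 8 cells form the scrub area
scrub_area = set(range(8))
faulty = inject_unidirectional_faults(trained, 0.3, rng)
scrubbed = approximate_scrub(faulty, scrub_area)
assert all(scrubbed[i] == 1 for i in scrub_area)  # scrub area fully restored
```

Cells outside the scrub area are left untouched by the scrub, which mirrors the "approximate" aspect: only the designated region is guaranteed to be restored.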