4.5 Article

NeuroScrub+: Mitigating Retention Faults Using Flexible Approximate Scrubbing in Neuromorphic Fabric Based on Resistive Memories

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TCAD.2022.3205872

Keywords

Computation-in-memory; magnetic tunnel junction (MTJ); neuromorphic computing; neuromorphic; nonvolatile memory (NVM); retention faults


In this work, an approximate scrubbing technique is proposed for NVM-based neuromorphic fabric to mitigate unidirectional retention faults and improve the inference accuracy of MLPs and CNNs with negligible storage overhead.
Neuromorphic computation-in-memory fabric based on emerging nonvolatile memories is considered an attractive option for accelerating neural networks (NNs) in hardware, as it offers high performance, low power consumption, and reduced data movement. Although nonvolatile resistive memories (NVMs) offer many benefits, they are susceptible to data retention faults, in which previously stored data are not retained after a certain amount of time due to external influences. These faults tend to occur unidirectionally and severely impact the inference accuracy of hardware NN implementations, since the synaptic weights stored in the NVMs are subject to them. In this work, we propose an approximate scrubbing technique for NVM-based neuromorphic fabric that mitigates unidirectional retention faults with virtually zero storage overhead, depending on the definition of the scrub area, for multilayer perceptrons (MLPs) and convolutional NNs (CNNs). The training of the NNs is adjusted accordingly to meet the requirements of the proposed approximate scrubbing scheme. On different benchmarks, the proposed scrubbing approach can improve the inference accuracy up to 85.51% for MLPs and 87.76% for CNNs over the expected device operational time, with negligible storage overhead.
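
To make the fault model and the scrubbing idea in the abstract concrete, below is a minimal, hypothetical Python/NumPy sketch, not the authors' implementation. It assumes binarized weights, retention faults that only flip 1 to 0, a scrub area that training has biased toward the value 1, and a periodic scrub that blindly rewrites the whole area with that value instead of restoring from a stored golden copy; the fault rate, area size, and scrub interval are illustrative placeholders.

import numpy as np

rng = np.random.default_rng(0)

# Intended (fault-free) binary weights. The scrub area is assumed to have been
# trained to be predominantly 1, so an all-ones rewrite approximates its true
# contents; the second region is left unprotected for comparison.
scrub_area = (rng.random(1024) < 0.9).astype(np.int8)   # ~90% ones
unprotected = (rng.random(1024) < 0.5).astype(np.int8)  # balanced region
golden = np.concatenate([scrub_area, unprotected])

def decay(mem, fault_rate):
    # Unidirectional retention faults: each stored 1 flips to 0 with
    # probability fault_rate per time step; stored 0s are assumed stable.
    flips = (mem == 1) & (rng.random(mem.size) < fault_rate)
    out = mem.copy()
    out[flips] = 0
    return out

def approximate_scrub(mem, area_len):
    # Rewrite the scrub area with its known dominant value (all 1s).
    # No per-cell backup is kept, hence the near-zero storage overhead;
    # the few intended 0s inside the area are overwritten, which is the
    # "approximate" part that training has to tolerate.
    out = mem.copy()
    out[:area_len] = 1
    return out

def agreement(mem):
    # Fraction of cells still matching the intended weights
    # (a crude stand-in for inference accuracy).
    return float(np.mean(mem == golden))

mem_plain, mem_scrubbed = golden.copy(), golden.copy()
for t in range(1, 101):
    mem_plain = decay(mem_plain, fault_rate=0.01)
    mem_scrubbed = decay(mem_scrubbed, fault_rate=0.01)
    if t % 10 == 0:  # assumed periodic scrub interval
        mem_scrubbed = approximate_scrub(mem_scrubbed, scrub_area.size)

print(f"agreement without scrubbing:      {agreement(mem_plain):.3f}")
print(f"agreement with approximate scrub: {agreement(mem_scrubbed):.3f}")

In a scheme of this kind, the storage saving comes from replacing a per-cell golden copy with a constant restore pattern plus the scrub-area boundaries; this is why training must bias the weights inside each scrub area toward the restore value, and why the restoration is only approximate.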


Reviews

Overall rating: 4.5 (insufficient ratings)

Secondary ratings
Novelty: -
Significance: -
Scientific rigor: -