Article

NAS4RRAM: neural network architecture search for inference on RRAM-based accelerators

Journal

SCIENCE CHINA-INFORMATION SCIENCES
Volume 64, Issue 6, Pages: -

Publisher

SCIENCE PRESS
DOI: 10.1007/s11432-020-3245-7

Keywords

network architecture search (NAS); neural networks; RRAM-based accelerator; hardware noise; quantization

Funding

  1. National Key Research and Development Project of China [2018YFB-1003304]
  2. National Natural Science Foundation of China [61832020, 62032001]


This paper identifies the requirements for deploying neural networks on RRAM-based accelerators and proposes a NAS-based framework that designs networks meeting those requirements while achieving high prediction accuracy.
RRAM-based accelerators enable fast and energy-efficient inference for neural networks. However, deploying a network on such an accelerator imposes requirements that existing networks do not take into account. (1) Device noise and the analog-to-digital/digital-to-analog converters (ADC/DAC) degrade prediction accuracy, so their effects should be modeled in the network. (2) Because the weights are mapped onto RRAM cells, they must be quantized, and the total number of weights is bounded by the number of RRAM cells in the accelerator. These requirements motivate customizing hardware-friendly networks for RRAM-based accelerators. We adopt neural architecture search (NAS) to design networks with high prediction accuracy that satisfy these constraints, and we propose NAS4RRAM, a framework that searches for the optimal network for a given RRAM-based accelerator. Experiments demonstrate that NAS4RRAM applies to RRAM-based accelerators of different scales, and that the searched networks outperform manually designed ResNets.
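The two deployment requirements above (modeling device noise with ADC/DAC effects, and quantizing weights onto limited RRAM cells) can be illustrated with a minimal NumPy sketch. This is a hypothetical simulation, not the paper's actual implementation; the bit widths and noise level are illustrative assumptions.

```python
import numpy as np

def quantize(x, n_bits, x_max):
    """Uniformly quantize x onto 2**n_bits - 1 symmetric levels in [-x_max, x_max]."""
    levels = 2 ** n_bits - 1
    step = 2 * x_max / levels
    return np.clip(np.round(x / step), -(levels // 2), levels // 2) * step

def rram_linear(x, w, w_bits=4, adc_bits=6, noise_std=0.02, rng=None):
    """Simulate one crossbar matrix-vector product on an RRAM accelerator:
    weights quantized to discrete conductance levels, perturbed by Gaussian
    device noise, with the analog accumulation re-quantized by a finite-
    resolution ADC. All parameters here are illustrative assumptions."""
    rng = np.random.default_rng(0) if rng is None else rng
    w_max = np.abs(w).max()
    wq = quantize(w, w_bits, w_max)                          # map weights to RRAM levels
    wn = wq + rng.normal(0.0, noise_std * w_max, wq.shape)   # additive device noise
    y = x @ wn                                               # analog accumulation
    return quantize(y, adc_bits, np.abs(y).max() + 1e-12)    # ADC readout
```

A NAS framework like the one described would evaluate candidate architectures with such a noisy, quantized forward pass, so that the searched network's accuracy reflects the target accelerator rather than ideal floating-point inference.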
