Proceedings Paper

Low Power In-Memory Implementation of Ternary Neural Networks with Resistive RAM-Based Synapse

Publisher

IEEE
DOI: 10.1109/aicas48895.2020.9073877

Funding

  1. European Research Council (ERC) grant NANOINFER [715872]
  2. Agence Nationale de la Recherche (ANR) grant NEURONIC [ANR-18-CE24-0009]

Abstract

The design of systems implementing low-precision neural networks with emerging memories such as resistive random access memory (RRAM) is a promising approach for reducing the energy consumption of artificial intelligence (AI). Multiple works have, for example, proposed in-memory architectures to implement low-power binarized neural networks. These simple neural networks, in which synaptic weights and neuronal activations assume binary values, can indeed approach state-of-the-art performance on vision tasks. In this work, we revisit one of these architectures, where synapses are implemented in a differential fashion to reduce bit errors and synaptic weights are read using precharge sense amplifiers. Based on experimental measurements on a hybrid 130 nm CMOS/RRAM chip and on circuit simulation, we show that the same memory array architecture can be used to implement ternary weights instead of binary weights, and that this technique is particularly appropriate when the sense amplifier is operated in the near-threshold regime. We also show, based on neural network simulation of the CIFAR-10 image recognition task, that going from binary to ternary neural networks significantly improves network accuracy. These results highlight that the function of AI circuits may sometimes be revisited when they are operated in low-power regimes.
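To make the scheme concrete, here is a minimal sketch (Python; not the authors' code) of the differential readout the abstract describes: each synapse is a pair of RRAM devices, a +1 or -1 weight programs them in opposite resistive states, and a 0 weight programs both identically. The conductance values, the sensing margin standing in for the precharge sense amplifier's decision behavior, and the sign activation are all illustrative assumptions.

    import numpy as np

    # Hypothetical conductances for the RRAM resistive states, in siemens
    # (illustrative values, not taken from the paper).
    G_LRS = 100e-6  # low-resistance state
    G_HRS = 1e-6    # high-resistance state

    def sense_ternary(g_bl, g_blb, margin=20e-6):
        """Idealized differential read of one two-device synapse.

        A clear conductance imbalance between the bit line (BL) and its
        complement (BLb) is sensed as +1 or -1; a balanced pair, with
        both devices in the same state, is sensed as 0.
        """
        if g_bl - g_blb > margin:
            return +1
        if g_blb - g_bl > margin:
            return -1
        return 0

    rng = np.random.default_rng(0)
    weights = rng.choice([-1, 0, +1], size=(8, 16))  # ternary weight matrix

    # Program each weight as a device pair: +1 -> (LRS, HRS),
    # -1 -> (HRS, LRS), 0 -> (HRS, HRS).
    g_bl = np.where(weights == +1, G_LRS, G_HRS)
    g_blb = np.where(weights == -1, G_LRS, G_HRS)

    # Read the array back through the idealized sense amplifiers.
    read_back = np.vectorize(sense_ternary)(g_bl, g_blb)
    assert np.array_equal(read_back, weights)

    # Multiply-accumulate with ternary activations, then a sign
    # nonlinearity so the outputs are ternary as well.
    x = rng.choice([-1, 0, +1], size=16)
    print(np.sign(read_back @ x))

In this toy model the balanced pair is simply a third sensed level; the abstract's point is that operating the precharge sense amplifier in the near-threshold regime is what makes that third level usable in practice, a circuit-level behavior the margin parameter above abstracts away.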
