Article

Accelerating Inference of Convolutional Neural Networks Using In-memory Computing

Journal

Frontiers in Computational Neuroscience

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fncom.2021.674154

Keywords

convolutional neural network; in-memory computing; computational memory; AI hardware; neural network acceleration


Abstract
In-memory computing (IMC) is a non-von Neumann paradigm that has recently established itself as a promising approach for energy-efficient, high-throughput hardware for deep-learning applications. One prominent application of IMC is performing matrix-vector multiplication in O(1) time complexity by mapping the synaptic weights of a neural-network layer onto the devices of an IMC core. However, because its pattern of execution differs significantly from that of previous computational paradigms, IMC requires a rethinking of the architectural design choices made when designing deep-learning hardware. In this work, we focus on application-specific IMC hardware for inference of Convolutional Neural Networks (CNNs), and provide methodologies for implementing the various architectural components of the IMC core. Specifically, we present methods for mapping synaptic weights and activations onto the memory structures and give evidence of the various trade-offs therein, such as the one between on-chip memory requirements and execution latency. Lastly, we show how to employ these methods to implement a pipelined dataflow that offers throughput and latency beyond the state-of-the-art for image classification tasks.
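The O(1) matrix-vector multiplication mentioned in the abstract can be illustrated with a minimal numerical sketch. The model below is an assumption-laden simplification (not the authors' implementation): each synaptic weight is stored as a device conductance in a crossbar, input activations are applied as voltages along the rows, and by Ohm's and Kirchhoff's laws the column currents accumulate the full matrix-vector product in one parallel analog step. The function name `crossbar_mvm` and the optional `noise_std` parameter (modeling device non-ideality) are illustrative choices.

```python
import numpy as np

def crossbar_mvm(weights, activations, noise_std=0.0, rng=None):
    """Idealized in-memory matrix-vector multiply on a crossbar.

    weights     : (n_inputs, n_outputs) array, stored as conductances
    activations : (n_inputs,) array, applied as row voltages
    noise_std   : std-dev of Gaussian conductance noise (device variability)
    """
    rng = np.random.default_rng() if rng is None else rng
    conductances = weights + rng.normal(0.0, noise_std, weights.shape)
    # In hardware, all column currents settle simultaneously, so the cost is
    # independent of matrix size; the matmul below merely stands in for that
    # single parallel analog operation.
    return conductances.T @ activations

# Example: a 3-input, 2-output synaptic layer.
W = np.array([[0.2, -0.5],
              [1.0,  0.3],
              [-0.4, 0.8]])
x = np.array([1.0, 0.5, -1.0])
print(crossbar_mvm(W, x))  # noiseless case: equals W.T @ x
```

In a digital von Neumann machine the same product costs O(n·m) multiply-accumulates and repeated weight movement between memory and compute units; in the IMC model the weights never move, which is the source of the energy and throughput advantage the paper builds on.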
