Article

Synaptic Resistors for Concurrent Inference and Learning with High Energy Efficiency

Journal

ADVANCED MATERIALS
Volume 31, Issue 18, Pages: -

Publisher

WILEY-V C H VERLAG GMBH
DOI: 10.1002/adma.201808032

Keywords

carbon nanotube; concurrent inference and learning; high energy efficiency; parallelism; synaptic resistor

Funding

  1. Air Force Office of Scientific Research (AFOSR) [FA9550-15-1-0056, FA9550-16-1-0087]

The fastest supercomputer, Summit, has a speed comparable to that of the human brain, but it is far less energy-efficient (≈10^10 FLOPS W^-1, floating-point operations per second per watt) than the brain (≈10^15 FLOPS W^-1). The brain processes and learns from big data concurrently via trillions of synapses operating in parallel analog mode. By contrast, computers execute algorithms on physically separated logic and memory transistors in serial digital mode, which fundamentally restrains them from handling big data efficiently. Existing electronic devices can perform inference with high speed and energy efficiency, but they still lack the synaptic functions needed to carry out concurrent convolutional inference and correlative learning efficiently, as the brain does. In this work, synaptic resistors (synstors) are reported that emulate the analog convolutional signal processing, correlative learning, and nonvolatile memory functions of synapses. By circumventing the fundamental limitations of computers, a synaptic resistor circuit performs speech inference and learning concurrently in parallel analog mode with an energy efficiency of ≈1.6 × 10^17 FLOPS W^-1, about seven orders of magnitude higher than that of the Summit supercomputer. Scaled-up synstor circuits could circumvent these fundamental limitations and enable real-time inference and learning from big data with high efficiency and speed in intelligent systems.
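The following is a minimal conceptual sketch, not the authors' device model or circuit: it assumes a toy crossbar of conductance-like weights, a Hebbian-style correlative update, and arbitrary array sizes, purely to illustrate what "concurrent inference and learning in parallel analog mode" means and to check the quoted energy-efficiency ratio.

```python
import numpy as np

# Hedged illustrative sketch (assumed model, not the paper's synstor physics):
# a toy crossbar whose weights both produce an output (inference) and are
# updated by a correlative, Hebbian-like rule during the same step.

rng = np.random.default_rng(0)

n_inputs, n_outputs = 16, 4                        # hypothetical array size
W = rng.normal(0.0, 0.1, (n_outputs, n_inputs))    # conductance-like weights

def concurrent_step(W, x, eta=0.01):
    """One concurrent inference-and-learning step.

    Inference: outputs are a parallel weighted sum of the inputs
    (the analog dot product a crossbar computes in one step).
    Learning: a correlative update strengthens weights whose input and
    output signals coincide, applied in the same pass as inference.
    """
    y = W @ x                         # inference
    W = W + eta * np.outer(y, x)      # assumed Hebbian-like correlative update
    return W, y

x = rng.normal(size=n_inputs)         # toy input vector (e.g., speech features)
for _ in range(5):
    W, y = concurrent_step(W, x)

# Sanity check of the efficiency comparison quoted in the abstract:
summit = 1e10       # ≈ FLOPS W^-1, Summit supercomputer
synstor = 1.6e17    # ≈ FLOPS W^-1, reported synstor circuit
print(f"ratio ≈ {synstor / summit:.1e}")  # ≈ 1.6e7, about seven orders of magnitude
```

In this sketch the same weight matrix serves inference and learning in a single pass, in contrast to a conventional computer, where weights would be fetched from separate memory, multiplied in logic units, and written back serially.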

