4.5 Article

Motivation for and Evaluation of the First Tensor Processing Unit

Journal

IEEE MICRO
Volume 38, Issue 3, Pages 10-19

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/MM.2018.032271057

Keywords

-

Abstract

The first-generation tensor processing unit (TPU) runs deep neural network (DNN) inference 15-30 times faster with 30-80 times better energy efficiency than contemporary CPUs and GPUs in similar semiconductor technologies. This domain-specific architecture (DSA) is a custom chip that has been deployed in Google datacenters since 2015, where it serves billions of people.
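
The abstract's headline claims are ratios: relative throughput, and relative performance per watt. As a minimal sketch, using hypothetical throughput and power figures (not taken from the paper), the comparison reduces to the following arithmetic.

```python
# Minimal sketch of speedup and performance-per-watt comparisons.
# All figures below are hypothetical, for illustration only.

def relative_speedup(ops_per_sec_a: float, ops_per_sec_b: float) -> float:
    """Throughput of chip A expressed as a multiple of chip B's throughput."""
    return ops_per_sec_a / ops_per_sec_b


def relative_efficiency(ops_per_sec_a: float, watts_a: float,
                        ops_per_sec_b: float, watts_b: float) -> float:
    """Performance per watt of chip A as a multiple of chip B's."""
    return (ops_per_sec_a / watts_a) / (ops_per_sec_b / watts_b)


if __name__ == "__main__":
    # Hypothetical accelerator and CPU figures (ops/s, watts); the paper itself
    # reports measured speedups of 15-30x and efficiency gains of 30-80x.
    accel_ops, accel_watts = 90e12, 75.0
    cpu_ops, cpu_watts = 3e12, 145.0

    print(f"speedup:            {relative_speedup(accel_ops, cpu_ops):.1f}x")
    print(f"perf/W improvement: "
          f"{relative_efficiency(accel_ops, accel_watts, cpu_ops, cpu_watts):.1f}x")
```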
