3.8 Proceedings Paper

The Impact of GPU DVFS on the Energy and Performance of Deep Learning: an Empirical Study

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/3307772.3328315

Keywords

Graphics Processing Units; Dynamic Voltage and Frequency Scaling; Deep Convolutional Neural Network

Funding

  1. Hong Kong RGC GRF grant [HKBU 12200418]

Abstract

Over the past years, great progress has been made in improving the computing power of general-purpose graphics processing units (GPGPUs), which has facilitated the success of deep neural networks (DNNs) in fields such as computer vision and natural language processing. A typical DNN training process repeatedly updates tens of millions of parameters, which not only requires huge computing resources but also consumes significant energy. To train DNNs in a more energy-efficient way, we empirically investigate the impact of GPU Dynamic Voltage and Frequency Scaling (DVFS) on the energy consumption and performance of deep learning. Our experiments cover a wide range of GPU architectures, DVFS settings, and DNN configurations. We observe that, compared to the default core frequency settings of the three tested GPUs, the optimal core frequency can reduce energy consumption by 8.7% to 23.1% for different DNN training cases. For inference, the savings range from 19.6% to 26.4%. Our findings suggest that GPU DVFS has great potential for developing energy-efficient DNN training and inference schemes.
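The record does not include the authors' measurement tooling, but a minimal sketch of the kind of core-frequency sweep the abstract describes could look like the following. This is an illustrative assumption, not the paper's method: it uses the NVIDIA NVML Python bindings (pynvml), assumes a Volta-or-newer GPU that supports locked clocks and the on-board cumulative energy counter, requires privileges to change clocks, and train_one_step is a hypothetical placeholder for one real training iteration.

# Sketch: sweep fixed GPU core frequencies and record energy per batch of steps.
# Assumptions: pynvml installed, Volta+ GPU, sufficient privileges to lock clocks;
# train_one_step is a hypothetical stand-in for one forward/backward/update pass.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

def train_one_step():
    # Placeholder: replace with one actual DNN training iteration.
    time.sleep(0.01)

def energy_over(steps, step_fn):
    """Return (joules, seconds) spent running `steps` iterations, using the
    GPU's cumulative energy counter (reported in millijoules)."""
    start_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    t0 = time.time()
    for _ in range(steps):
        step_fn()
    seconds = time.time() - t0
    end_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    return (end_mj - start_mj) / 1000.0, seconds

# Candidate SM core frequencies in MHz; the supported set is device-specific
# (see `nvidia-smi -q -d SUPPORTED_CLOCKS`).
for core_mhz in (900, 1100, 1300, 1500):
    # Pin min and max SM clocks to the same value, i.e. a fixed core frequency,
    # equivalent to `nvidia-smi --lock-gpu-clocks=<mhz>,<mhz>`.
    pynvml.nvmlDeviceSetGpuLockedClocks(handle, core_mhz, core_mhz)
    joules, seconds = energy_over(100, train_one_step)
    print(f"{core_mhz} MHz: {joules:.1f} J, {seconds:.2f} s for 100 steps")

pynvml.nvmlDeviceResetGpuLockedClocks(handle)  # restore the default DVFS policy
pynvml.nvmlShutdown()

Locking the minimum and maximum SM clocks to the same value emulates a fixed DVFS core-frequency setting; resetting afterward returns the GPU to its default frequency-scaling policy.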
