Article

DNNOff: Offloading DNN-Based Intelligent IoT Applications in Mobile Edge Computing

Journal

IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS
Volume 18, Issue 4, Pages 2820-2829

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TII.2021.3075464

Keywords

Computational modeling; Object oriented modeling; Cloud computing; Servers; Neural networks; Informatics; Estimation; Computation offloading; deep neural networks (DNNs); intelligent Internet of Things (IoT) application; mobile edge computing (MEC); software adaption

Funding

  1. National Natural Science Foundation of China [62072108]
  2. Natural Science Foundation of Fujian Province for Distinguished Young Scholars [2020J06014]


This article presents a solution for running DNN-based applications on intelligent devices with limited resources, using computation offloading to transfer computation-intensive tasks to the cloud or edge servers. By rewriting the application's source code and determining the offloading scheme dynamically at runtime, the approach shows significant performance gains on a real-world intelligent application.
Deep neural networks (DNNs) have become increasingly popular in industrial Internet of Things (IoT) scenarios. Because of their high computational demands, DNN-based applications are hard to run directly on intelligent end devices with limited resources. Computation offloading offers a feasible solution by moving some computation-intensive tasks to the cloud or edge servers. Supporting this capability is not easy for two reasons: adaptability, since offloading should occur dynamically among computation nodes, and effectiveness, since it must be determined which parts of the application are worth offloading. This article proposes a novel approach, called DNNOff. For a given DNN-based application, DNNOff first rewrites the source code into a program structure that supports on-demand offloading and then, at runtime, automatically determines the offloading scheme. We evaluated DNNOff on a real-world intelligent application with three DNN models. Our results show that, compared with other approaches, DNNOff reduces response time by 12.4-66.6% on average.
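To make the runtime decision concrete, the sketch below illustrates the general idea of layer-wise DNN offloading with a single split point: layers before the split run on the device, the intermediate output is sent over the network, and the remaining layers run on an edge server. This is only a minimal illustration of the technique the abstract describes, not the paper's actual implementation; the layer names, timings, output sizes, and bandwidth are hypothetical placeholders.

```python
# Minimal sketch of split-point selection for layer-wise DNN offloading.
# All per-layer numbers below are illustrative, not measurements from DNNOff.

from dataclasses import dataclass


@dataclass
class LayerProfile:
    name: str
    local_ms: float   # estimated execution time on the end device
    remote_ms: float  # estimated execution time on the edge server
    out_kb: float     # size of the layer's output tensor to transmit


def best_split(layers, bandwidth_kb_per_ms, input_kb):
    """Return (split_index, latency_ms): layers[:split] run locally,
    layers[split:] run on the edge server. split == len(layers) means
    fully local execution; split == 0 means fully remote execution."""
    best = (len(layers), sum(l.local_ms for l in layers))  # fully local baseline
    for split in range(len(layers) + 1):
        local = sum(l.local_ms for l in layers[:split])
        remote = sum(l.remote_ms for l in layers[split:])
        if split == len(layers):
            transfer = 0.0  # nothing is offloaded, nothing is transmitted
        else:
            # Data crossing the split: the raw input if everything is offloaded,
            # otherwise the output of the last locally executed layer.
            payload = input_kb if split == 0 else layers[split - 1].out_kb
            transfer = payload / bandwidth_kb_per_ms
        total = local + transfer + remote
        if total < best[1]:
            best = (split, total)
    return best


if __name__ == "__main__":
    profile = [
        LayerProfile("conv1", local_ms=40, remote_ms=5, out_kb=800),
        LayerProfile("conv2", local_ms=60, remote_ms=8, out_kb=300),
        LayerProfile("fc1",   local_ms=25, remote_ms=3, out_kb=16),
        LayerProfile("fc2",   local_ms=5,  remote_ms=1, out_kb=4),
    ]
    split, latency = best_split(profile, bandwidth_kb_per_ms=50, input_kb=600)
    print(f"offload layers[{split}:] to the edge, estimated latency {latency:.1f} ms")
```

Because bandwidth and server load change over time, a scheme like this would be re-evaluated at runtime with fresh estimates, which is the adaptability requirement the abstract highlights.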
