Article

JointDNN: An Efficient Training and Inference Engine for Intelligent Mobile Cloud Computing Services

Journal

IEEE TRANSACTIONS ON MOBILE COMPUTING
Volume 20, Issue 2, Pages 565-576

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TMC.2019.2947893

Keywords

Deep neural networks; intelligent services; mobile computing; cloud computing

Funding

  1. NSF SHF
  2. DARPA MTO
  3. USC Annenberg Fellowship


The paper introduces JointDNN, an engine for collaborative computation between mobile devices and the cloud for DNNs, which improves energy and performance efficiency for mobile devices while reducing the workload and communications for the cloud server. By processing some layers on mobile devices and others on the cloud server, JointDNN can adapt to battery limitations, server load constraints, and quality of service requirements, achieving significant reductions in latency and mobile energy consumption compared to traditional approaches.
Deep learning models are being deployed in many mobile intelligent applications. End-side services, such as intelligent personal assistants, autonomous cars, and smart home services, often employ either simple local models on the mobile device or complex remote models on the cloud. However, recent studies have shown that partitioning DNN computations between the mobile device and the cloud can improve both latency and energy efficiency. In this paper, we propose an efficient, adaptive, and practical engine, JointDNN, for collaborative computation between a mobile device and the cloud for DNNs in both the inference and training phases. JointDNN not only provides an energy- and performance-efficient method of querying DNNs for the mobile side but also benefits the cloud server by reducing its workload and communication compared to the cloud-only approach. Given the DNN architecture, we investigate the efficiency of processing some layers on the mobile device and some layers on the cloud server. We provide optimization formulations at layer granularity for forward and backward propagation in DNNs, which can adapt to mobile battery limitations, cloud server load constraints, and quality-of-service requirements. JointDNN achieves up to 18x and 32x reductions in the latency and mobile energy consumption of querying DNNs, respectively, compared to status-quo approaches.
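The layer-granularity partitioning described above can be illustrated with a simple dynamic program. This is a hedged sketch, not the paper's actual formulation (the authors cast the problem as optimization formulations over the DNN graph): each layer runs on either the mobile device or the cloud, and a network-transfer cost is paid whenever execution crosses sides. All function names and cost numbers below are made up for illustration.

```python
def partition(mobile_cost, cloud_cost, transfer_cost):
    """Illustrative two-state dynamic program over DNN layers.

    mobile_cost[i], cloud_cost[i]: latency of running layer i on each side.
    transfer_cost[i]: latency of shipping layer i's input over the network,
    paid when consecutive layers run on different sides (and for layer 0 if
    it starts on the cloud, since the input originates on the mobile device).
    Returns (total latency, per-layer placement) minimizing end-to-end latency.
    """
    n = len(mobile_cost)
    # best[s] = (total latency so far, placement list) with layer 0..i on side s last
    best = {
        "mobile": (mobile_cost[0], ["mobile"]),
        "cloud": (transfer_cost[0] + cloud_cost[0], ["cloud"]),
    }
    for i in range(1, n):
        step = {"mobile": mobile_cost[i], "cloud": cloud_cost[i]}
        new_best = {}
        for side in ("mobile", "cloud"):
            candidates = []
            for prev in ("mobile", "cloud"):
                cost, path = best[prev]
                hop = transfer_cost[i] if prev != side else 0.0
                candidates.append((cost + hop + step[side], path + [side]))
            new_best[side] = min(candidates)
        best = new_best
    return min(best.values())


# Example numbers (hypothetical): early layers are cheap on mobile, the cloud
# is fast but requires uploading data, and feature maps shrink with depth.
latency, placement = partition(
    mobile_cost=[5.0, 9.0, 9.0, 1.0],
    cloud_cost=[1.0, 1.0, 1.0, 0.5],
    transfer_cost=[8.0, 2.0, 2.0, 0.1],
)
print(latency, placement)  # → 9.5 ['mobile', 'cloud', 'cloud', 'cloud']
```

With these example costs, the optimum runs the first layer on the mobile device (avoiding the expensive initial upload) and offloads the rest once the intermediate feature map is small, which is exactly the kind of split the engine exploits.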

