Journal
2019 IEEE 25TH INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS (ICPADS)
Volume -, Issue -, Pages 438-445
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/ICPADS47876.2019.00069
Keywords
DNN inference; edge computing; multi-path; partition; offloading
Funding
- National Key R&D Program of China [2018AAA0100500, 2017YFB1003000]
- National Natural Science Foundation of China [61872079, 61572129, 61602112, 61702096, 61632008, 61702097]
- Natural Science Foundation of Jiangsu Province [BK20160695, BK20170689]
- Jiangsu Provincial Key Laboratory of Network and Information Security [BM2003201]
- Key Laboratory of Computer Network and Information Integration of Ministry of Education of China [93K-9]
- Collaborative Innovation Center of Novel Software Technology and Industrialization
- Collaborative Innovation Center of Wireless Communications Technology
Implementing intelligent mobile applications on IoT devices with DNN technology has become an inevitable trend. Because of the limited size of DNN models that can be deployed on end devices and the instability of wide-area network transmission, neither the End-only mode nor the Cloud-only mode can guarantee reasonable latency and recognition accuracy simultaneously. A better solution is to exploit edge computing; however, existing edge computing execution frameworks and offloading mechanisms for DNN inference suffer from unnecessary computational overhead and underutilize the computing capacity of both end devices and edge servers. To address these shortcomings, this paper proposes an adaptive distributed DNN inference acceleration framework for edge computing environments that jointly optimizes the DNN computation path and the DNN computation partition. Evaluations demonstrate that our method effectively accelerates DNN inference compared to state-of-the-art methods.
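The partition optimization mentioned in the abstract can be illustrated with a minimal sketch: given per-layer compute latencies on the device and the edge, plus the size of the intermediate tensor crossing each possible split point, pick the split that minimizes end-to-end latency. All numbers and names below are illustrative assumptions, not the paper's actual method or data.

```python
def best_partition(device_ms, edge_ms, transfer_kb, bandwidth_kbps):
    """Choose a split point k: layers 0..k-1 run on the end device,
    layers k.. run on the edge server.

    device_ms[i]   - latency of layer i on the device (ms)
    edge_ms[i]     - latency of layer i on the edge (ms)
    transfer_kb[k] - data crossing split point k (transfer_kb[0] is the
                     raw input, transfer_kb[n] the final output)
    Returns (k, total latency in ms)."""
    n = len(device_ms)
    assert len(edge_ms) == n and len(transfer_kb) == n + 1
    best_k, best_lat = 0, float("inf")
    for k in range(n + 1):
        lat = (sum(device_ms[:k])                        # on-device layers
               + transfer_kb[k] / bandwidth_kbps * 1000  # uplink transfer (ms)
               + sum(edge_ms[k:]))                       # edge-side layers
        if lat < best_lat:
            best_k, best_lat = k, lat
    return best_k, best_lat


# Hypothetical 3-layer model: the device is slow, the edge is fast,
# and intermediate tensors shrink with depth, so a middle split wins.
k, lat = best_partition(device_ms=[30, 40, 50],
                        edge_ms=[3, 4, 5],
                        transfer_kb=[500, 100, 20, 1],
                        bandwidth_kbps=1000)
print(k, lat)  # splits after layer 2: 70 ms device + 20 ms transfer + 5 ms edge
```

This exhaustive scan over split points is linear in model depth for chain-structured DNNs; multi-path models (as considered in the paper) require reasoning over the full computation graph rather than a single cut index.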