Journal
JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING
Volume 166, Pages 15-31
Publisher
ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jpdc.2022.03.001
Keywords
Edge caching; Energy-latency tradeoffs; Dynamic service migration; Deep Q-Network
Funding
- National Natural Science Foundation of China (NSFC) [62171330, 61873341]
- Open Project of Key Laboratory of Higher Education of Sichuan Province for Enterprise Informationalization and Internet of Things [2021WYJ01]
- Open Project of Science and Technology on Information Systems Engineering Laboratory
Abstract
Mobile edge computing sinks computing and storage capabilities to the edge of the network to provide reliable, low-latency services. However, user mobility and the limited coverage of edge servers can cause service interruptions and degrade service quality. A cooperative edge caching strategy based on an energy-latency balance is proposed to address the high power consumption and latency caused by processing computation-intensive applications. In the cache selection phase, a request prediction method based on a deep neural network improves the cache hit rate. In the cache placement phase, an objective function is established that jointly considers power consumption and latency, and a branch-and-bound algorithm is used to obtain the optimal placement. To address service interruptions caused by user movement, an improved service migration method is proposed: the migration problem is modeled as a Markov decision process (MDP) whose goal is to reduce service latency and improve user experience within given cost and computing-resource budgets, and the optimal policy is obtained with the deep Q-Network (DQN) algorithm. Experiments show that the proposed edge caching algorithm achieves lower latency and energy consumption than other algorithms under the same conditions, and the proposed service migration algorithm outperforms other service migration algorithms in migration cost and success rate. (c) 2022 Elsevier Inc. All rights reserved.
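The cache placement step described above (a weighted power-plus-latency objective under a storage budget, solved by branch-and-bound) can be sketched as a 0/1 knapsack search. This is a minimal illustration, not the paper's actual formulation: the item sizes, the per-item "saving" values (the weighted energy-latency reduction from caching an item at the edge), and the capacity are all hypothetical.

```python
def branch_and_bound(sizes, savings, capacity):
    """Best-first 0/1 branch-and-bound: choose a cache placement that
    maximizes total (energy + latency) saving within the storage capacity."""
    n = len(sizes)
    # Explore items in decreasing saving-per-unit-size order.
    order = sorted(range(n), key=lambda i: savings[i] / sizes[i], reverse=True)

    def bound(idx, cap, val):
        # Fractional-relaxation upper bound over the remaining items.
        for j in order[idx:]:
            if sizes[j] <= cap:
                cap -= sizes[j]
                val += savings[j]
            else:
                return val + savings[j] * cap / sizes[j]
        return val

    best = [0.0, frozenset()]  # [best saving, best placement]

    def explore(idx, cap, val, chosen):
        if val > best[0]:
            best[0], best[1] = val, frozenset(chosen)
        # Prune when the optimistic bound cannot beat the incumbent.
        if idx == n or bound(idx, cap, val) <= best[0]:
            return
        j = order[idx]
        if sizes[j] <= cap:  # branch 1: cache item j
            explore(idx + 1, cap - sizes[j], val + savings[j], chosen | {j})
        explore(idx + 1, cap, val, chosen)  # branch 2: skip item j

    explore(0, capacity, 0.0, frozenset())
    return best[0], best[1]

# Hypothetical instance: 4 contents, edge cache capacity 7.
sizes = [3, 4, 2, 5]
savings = [6.0, 7.0, 3.0, 8.0]
value, placement = branch_and_bound(sizes, savings, capacity=7)
```

On this toy instance the search caches items 0 and 1 (sizes 3+4 fit the capacity of 7) for a total saving of 13, matching an exhaustive check of all feasible subsets.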
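The service migration part can likewise be sketched as a small MDP: the state is (user zone, service host zone), the action is keep-or-migrate, and the reward penalizes latency plus a one-off migration cost. The paper trains a deep Q-Network; the sketch below uses a plain Q-table (equivalently, a linear Q-function over a one-hot state) as a minimal stand-in, with hypothetical zone counts, latencies, costs, and mobility probability.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ZONES = 4           # edge-server coverage zones (hypothetical toy setting)
LATENCY_LOCAL = 1.0   # latency when the service is hosted in the user's zone
LATENCY_REMOTE = 5.0  # latency when the service is hosted elsewhere
MIGRATE_COST = 2.0    # one-off cost charged for a migration decision

N_STATES = N_ZONES * N_ZONES  # state = (user zone, host zone)
N_ACTIONS = 2                 # 0 = keep service in place, 1 = migrate to user

def state_id(user, host):
    return user * N_ZONES + host

def step(user, host, action):
    """One MDP transition: apply the migration decision, pay latency plus
    migration cost as negative reward, then let the user move randomly."""
    cost = 0.0
    if action == 1:  # toy simplification: migrating always charges the cost
        host, cost = user, MIGRATE_COST
    latency = LATENCY_LOCAL if host == user else LATENCY_REMOTE
    reward = -(latency + cost)
    if rng.random() < 0.3:  # user mobility: jump to a random zone
        user = int(rng.integers(N_ZONES))
    return user, host, reward

Q = np.zeros((N_STATES, N_ACTIONS))  # tabular stand-in for the paper's DQN
GAMMA, LR, EPS = 0.9, 0.1, 0.2       # discount, step size, exploration rate

user, host = 0, 0
for _ in range(20000):
    s = state_id(user, host)
    # Epsilon-greedy action selection.
    a = int(rng.integers(N_ACTIONS)) if rng.random() < EPS else int(np.argmax(Q[s]))
    user2, host2, r = step(user, host, a)
    s2 = state_id(user2, host2)
    # Temporal-difference update toward the Bellman target.
    Q[s, a] += LR * (r + GAMMA * np.max(Q[s2]) - Q[s, a])
    user, host = user2, host2
```

After training, the learned policy should migrate when the user has left the host's zone (the remote-latency penalty outweighs the migration cost) and keep the service in place when user and host are co-located.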