Journal
IEEE TRANSACTIONS ON MOBILE COMPUTING
Volume 14, Issue 12, Pages 2516-2529
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TMC.2015.2405539
Keywords
Mobile cloud; offloading; offloading failure; mobility; Markov decision process; threshold policy
Funding
- Singapore Ministry of Education (MOE) Tier 1 [RG18/13, RG33/12]
Abstract
The emergence of mobile cloud computing enables mobile users to offload applications to nearby resource-rich mobile devices (i.e., cloudlets) to reduce energy consumption and improve performance. However, due to user mobility and limited cloudlet capacity, the connections between a mobile user and mobile cloudlets can be intermittent, so offloading actions taken by the mobile user may fail (e.g., when the user moves out of the communication range of the cloudlets). In this paper, we develop an optimal offloading algorithm for the mobile user in such an intermittently connected cloudlet system, taking into account the user's local load and the availability of cloudlets. We examine users' mobility patterns and cloudlets' admission control, and analytically derive the probability that an offloading action succeeds. We formulate and solve a Markov decision process (MDP) model to obtain an offloading policy that minimizes the user's computation and offloading costs, and we prove that the optimal policy of the MDP has a threshold structure. Building on this structure, we introduce a fast algorithm that lets energy-constrained users make offloading decisions. Numerical results show that the analytical form of the successful-offloading probability is a good estimate under various mobility patterns, and that the proposed MDP offloading algorithm outperforms conventional baseline schemes.
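The threshold structure mentioned in the abstract can be illustrated on a toy offloading MDP solved by value iteration. This is a minimal sketch with hypothetical dynamics and parameters (queue-length state, whole-queue offloading, the costs `H`, `C_LOCAL`, `C_OFFLOAD`, and probabilities `P_SUCC`, `LAMBDA` are all invented for illustration), not the paper's actual model:

```python
# Toy discounted MDP: state n = number of pending tasks (0..N).
# Action "local" processes one task; action "offload" attempts to send
# the whole queue to a cloudlet, succeeding only with probability P_SUCC
# (an intermittent connection), otherwise the queue is unchanged.
# All parameters below are illustrative, not taken from the paper.
N = 20          # maximum queue length
H = 1.0         # holding cost per pending task per slot
C_LOCAL = 1.0   # cost of processing one task locally
C_OFFLOAD = 3.0 # cost of one offloading attempt
P_SUCC = 0.8    # probability the offloading attempt succeeds
LAMBDA = 0.6    # task arrival probability per slot
GAMMA = 0.9     # discount factor

def arrive(V, n):
    """Expected value after a possible arrival from queue length n."""
    return (1 - LAMBDA) * V[n] + LAMBDA * V[min(n + 1, N)]

def q_values(V, n):
    """Q-values of (local, offload) in state n."""
    pay = min(n, 1)  # no processing/offloading cost on an empty queue
    q_local = H * n + pay * C_LOCAL + GAMMA * arrive(V, max(n - 1, 0))
    q_offload = (H * n + pay * C_OFFLOAD
                 + GAMMA * (P_SUCC * arrive(V, 0)            # queue cleared
                            + (1 - P_SUCC) * arrive(V, n)))  # attempt failed
    return q_local, q_offload

def solve(iters=2000):
    """Value iteration; returns the value function and greedy policy."""
    V = [0.0] * (N + 1)
    for _ in range(iters):
        V = [min(q_values(V, n)) for n in range(N + 1)]
    # 0 = process locally, 1 = offload (ties broken toward local)
    policy = [0 if ql <= qo else 1
              for ql, qo in (q_values(V, n) for n in range(N + 1))]
    return V, policy

V, policy = solve()
print(policy)  # runs of 0s up to some queue length, then 1s
```

In this toy model the greedy policy comes out monotone in the queue length: process locally below some load, offload above it. That 0-then-1 pattern is the threshold structure the paper proves optimal in its richer setting (mobility patterns, admission control, offloading-failure probability).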