Energy-efficient Computation Offloading in Wireless Networks

Open Access
Geng, Yeli
Graduate Program:
Computer Science and Engineering
Degree:
Doctor of Philosophy
Document Type:
Date of Defense:
August 02, 2018
Committee Members:
  • Guohong Cao, Dissertation Advisor
  • Guohong Cao, Committee Chair
  • George Kesidis, Committee Member
  • Sencun Zhu, Committee Member
  • Zhen Lei, Outside Member
Keywords:
  • Energy consumption
  • Cellular phones
  • Computation offloading
  • Wireless communication
Due to the increased processing capability of mobile devices, computationally intensive mobile applications such as image/video processing, face recognition, and augmented reality are becoming increasingly popular. Such complex applications can quickly drain a mobile device's battery. One viable solution is computation offloading, which migrates local computations to resource-rich servers via wireless networks. However, transmitting data between mobile devices and the server also consumes energy. Hence, the key problem becomes how to selectively offload computationally intensive tasks to reduce energy consumption, and this dissertation addresses this problem from the following three aspects.

First, we design energy-efficient offloading algorithms that take into account the unique energy characteristics of wireless networks. The cellular interface stays in a high-power state after completing a data transmission, consuming a substantial amount of energy even when there is no network traffic (referred to as the "long tail" problem). To solve this problem, we analyze the effects of the long tail on task offloading and create decision states that represent all possible offloading decisions for each task. Based on these decision states, we design an algorithm that searches for the optimal offloading decision assuming perfect knowledge of future tasks, which may not be available in practice. Thus, we also design and evaluate an efficient online algorithm that can be deployed on mobile devices.

Second, we propose a peer-assisted computation offloading framework to save energy. In cellular networks, the service quality of a mobile device varies with its location due to practical deployment issues, and mobile devices with poor service quality incur high communication energy for computation offloading.
Through a peer-to-peer interface such as Wi-Fi Direct, mobile devices with poor service quality can offload computation tasks to a neighbor with better service quality, which then forwards them to the cloud through the cellular network. We also propose algorithms to decide which tasks should be offloaded to minimize energy consumption.

Finally, we address the problem of energy-efficient computation offloading on multicore-based mobile devices. In the ARM big.LITTLE multicore architecture, the big core offers high performance but consumes more energy, whereas the little core is energy efficient but less powerful. Thus, besides deciding whether to run a task locally or remotely, we must also decide which CPU cores should execute the local tasks, and consider how to exploit this architecture to minimize energy while satisfying application completion-time constraints. We first formalize the problem, which is NP-hard, and then propose a novel heuristic algorithm to solve it. Evaluation results show that our offloading algorithm can significantly reduce the energy consumption of mobile devices while satisfying the application completion-time constraints.
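As a toy illustration of the energy/deadline tradeoff described above (not the dissertation's actual algorithm), the following sketch assigns each task to the little core, the big core, or remote offloading by exhaustively searching the assignments that meet a completion-time deadline and picking the one with the lowest total energy. The per-task time and energy costs are made-up numbers, and the `plan` function and its interface are hypothetical.

```python
# Toy sketch of the energy-vs-deadline tradeoff: choose an execution target
# for each task -- "little" core, "big" core, or "offload" to the cloud --
# to minimize total energy while meeting a completion-time deadline.
# All costs below are illustrative assumptions, not measured values.
from itertools import product

TARGETS = ("little", "big", "offload")

def plan(tasks, deadline):
    """Brute-force search over target assignments (fine for small task sets).

    tasks: list of dicts mapping a target name to a (time_s, energy_j) pair.
    Returns (best_energy, best_assignment), or (None, None) if no
    assignment meets the deadline.
    """
    best = (None, None)
    for assignment in product(TARGETS, repeat=len(tasks)):
        total_time = sum(t[a][0] for t, a in zip(tasks, assignment))
        total_energy = sum(t[a][1] for t, a in zip(tasks, assignment))
        if total_time <= deadline and (best[0] is None or total_energy < best[0]):
            best = (total_energy, assignment)
    return best

# Two hypothetical tasks: the big core is fast but power-hungry,
# the little core is slow but frugal, offloading sits in between.
tasks = [
    {"little": (4.0, 1.0), "big": (1.5, 3.0), "offload": (2.0, 1.6)},
    {"little": (3.0, 0.8), "big": (1.0, 2.4), "offload": (2.5, 1.2)},
]
energy, assignment = plan(tasks, deadline=5.0)
```

In the dissertation's setting this exhaustive search would be replaced by the proposed heuristic, since the assignment space grows exponentially with the number of tasks; the sketch only shows the objective and constraint being traded off.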