Driven by technologies such as IoT-enabled health care, machine learning applications at the edge, and industrial automation, mobile edge and fog computing paradigms have reinforced a general trend toward decentralized computing, where any network node can route traffic, compute tasks, and store data, possibly at the same time. In many such computing environments, there is a need to cache significant amounts of data, which may include large data sets, machine learning models, or executable code. In this work, we propose a framework for joint computation scheduling, caching, and request forwarding within such decentralized computing environments. We first characterize the stability region of a "genie-aided" computing network in which the data required for computation are instantly accessible, and develop a throughput-optimal control policy for this model. Building on this result, we develop a practically implementable distributed and adaptive algorithm, and show that it achieves superior performance in terms of average task completion time compared to several baseline policies.
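To give a rough intuition for queue-driven control in such networks, the sketch below shows a generic max-weight-style local decision rule in which a node weighs local computation against forwarding a task to a neighbor using queue backlogs. This is a minimal illustrative sketch under assumed names and a simplified queue model (the Node class, decide function, and the rate parameters are all assumptions); it is not the control policy developed in the paper.

```python
# Illustrative sketch only: a generic max-weight-style decision rule for a node
# that can either process a queued task locally or forward it to a neighbor.
# The queue model, names, and weights below are assumptions for illustration,
# not the throughput-optimal policy proposed in the paper.

from dataclasses import dataclass, field


@dataclass
class Node:
    name: str
    task_queue: int = 0                                    # tasks backlogged here
    neighbor_queues: dict = field(default_factory=dict)    # neighbor -> its backlog


def decide(node: Node, compute_rate: float, link_rates: dict) -> str:
    """Pick the action with the largest backlog-weighted service rate."""
    # Local computation is weighted by this node's own backlog.
    best_action, best_weight = "compute", node.task_queue * compute_rate
    # Forwarding to a neighbor is weighted by the backlog differential.
    for nbr, backlog in node.neighbor_queues.items():
        weight = (node.task_queue - backlog) * link_rates.get(nbr, 0.0)
        if weight > best_weight:
            best_action, best_weight = f"forward->{nbr}", weight
    return best_action if best_weight > 0 else "idle"


if __name__ == "__main__":
    n = Node("edge-1", task_queue=8, neighbor_queues={"edge-2": 2, "cloud": 5})
    # With a congested local queue and a lightly loaded neighbor, forwarding wins.
    print(decide(n, compute_rate=1.0, link_rates={"edge-2": 3.0, "cloud": 1.5}))
```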
               