
DNN Deployment, Task Offloading, and Resource Allocation for Joint Task Inference in IIoT

Joint task inference, which fully utilizes end-edge-cloud cooperation, can effectively enhance the performance of deep neural network (DNN) inference services in industrial Internet of Things (IIoT) applications. In this paper, we propose a novel joint resource management scheme for a multi-task, multi-service scenario consisting of multiple sensors, a cloud server, and a base station equipped with an edge server. A time-slotted system model is proposed, incorporating DNN deployment, data size control, task offloading, computing resource allocation, and wireless channel allocation. Among these, DNN deployment places appropriate DNNs on the edge server under its total resource constraint, and data size control trades off task inference accuracy against task transmission delay by changing the task data size. Our goal is to minimize the total cost, comprising total task processing delay and total erroneous-inference penalty, while guaranteeing long-term task queue stability and all task inference accuracy requirements. Leveraging Lyapunov optimization, we first transform the optimization problem into a deterministic problem for each time slot. Then, a deep deterministic policy gradient (DDPG)-based deep reinforcement learning (DRL) algorithm is designed to provide a near-optimal solution. We further design a fast numerical method for the data size control subproblem to reduce the training complexity of the DRL model, and a penalty mechanism to prevent frequent re-optimization of the DNN deployment. Extensive experiments are conducted by varying different crucial parameters. The superiority of our scheme is demonstrated in comparison with three other schemes.

Keywords: resource; task; dnn deployment; task inference; allocation; inference

Journal Title: IEEE Transactions on Industrial Informatics
Year Published: 2023
