
IoTSL: Toward Efficient Distributed Learning for Resource-Constrained Internet of Things



Split learning (SL), a recently proposed distributed machine learning paradigm, enables model training without accessing clients' raw data; it can be viewed as a specific serialized form of federated learning. However, deploying SL on resource-constrained Internet of Things (IoT) devices still faces limitations, including high communication costs and catastrophic forgetting caused by the imbalanced data distributions across devices. In this article, we design and implement IoTSL, an efficient distributed learning framework for cloud-edge collaboration in IoT systems. IoTSL combines generative adversarial networks (GANs) with differential privacy to train generators on participating devices' local data and produce privacy-preserving synthetic data. On the one hand, IoTSL pretrains the global model on the generated data and then fine-tunes it on local data, lowering communication costs. On the other hand, the generated data is used to impute each device's missing classes, alleviating the commonly observed catastrophic forgetting. We evaluate the proposed framework on three common datasets. Extensive experimental results show that, compared with conventional SL, IoTSL significantly reduces communication costs and effectively alleviates catastrophic forgetting.
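The abstract describes a three-step workflow: devices publish privacy-protected generators trained on local data, the server pretrains a global model on pooled synthetic data, and each device imputes its missing classes with synthetic samples before fine-tuning locally. The following is a minimal toy sketch of that workflow, not the paper's actual method: the "generator" is replaced by a Laplace-mechanism release of a class mean (standing in for the DP-trained GAN), the "global model" is a nearest-class-mean classifier, and all names, class values, and parameters are illustrative.

```python
import math
import random

random.seed(7)

def laplace_noise(scale):
    # Sample Laplace(0, scale) via the inverse CDF (stdlib only).
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_class_mean(samples, epsilon=1.0):
    # Toy stand-in for a DP-trained generator: release a noised class
    # mean via the Laplace mechanism. Samples are assumed clipped to
    # [0, 1], so the mean's L1 sensitivity is 1/len(samples).
    mean = sum(samples) / len(samples)
    return mean + laplace_noise(1.0 / (len(samples) * epsilon))

def synthesize(noisy_mean, n=20, spread=0.05):
    # Draw synthetic points around the privatized mean.
    return [noisy_mean + random.gauss(0.0, spread) for _ in range(n)]

# Imbalanced devices: each holds raw data for only a subset of classes.
devices = {
    "dev_a": {0: [0.1 + random.gauss(0, 0.02) for _ in range(30)]},
    "dev_b": {1: [0.5 + random.gauss(0, 0.02) for _ in range(30)],
              2: [0.9 + random.gauss(0, 0.02) for _ in range(30)]},
}

# Step 1: each device publishes a privatized generator per local class.
generators = {}
for dev, classes in devices.items():
    for label, samples in classes.items():
        generators[label] = dp_class_mean(samples)

# Step 2: the server pretrains a global model on pooled synthetic data
# (here, a nearest-class-mean classifier); devices would then fine-tune
# it on raw local data, cutting the rounds of communication needed.
synthetic = {label: synthesize(g) for label, g in generators.items()}
global_means = {label: sum(p) / len(p) for label, p in synthetic.items()}

def predict(x):
    return min(global_means, key=lambda c: abs(x - global_means[c]))

# Step 3: devices impute missing classes with synthetic data, so local
# fine-tuning sees every class and catastrophic forgetting is reduced.
augmented = {}
for dev, classes in devices.items():
    local = dict(classes)
    for label, pts in synthetic.items():
        local.setdefault(label, pts)
    augmented[dev] = local
```

After imputation, every device's training set covers all three classes even though its raw data did not, which is the mechanism the abstract credits for mitigating forgetting under imbalanced data.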

Keywords: efficient distributed; constrained internet; distributed learning; internet things; resource constrained

Journal Title: IEEE Internet of Things Journal
Year Published: 2023



