Privacy-Preserving Federated Deep Learning With Irregular Users

Federated deep learning has been widely used in various fields. To protect data privacy, many privacy-preserving approaches have been designed and implemented in various scenarios. However, existing works rarely consider the fundamental issue that the data shared by certain users (called irregular users) may be of low quality. Obviously, in a federated training process, data shared by many irregular users may impair the training accuracy or, worse, render the final model useless. In this article, we propose PPFDL, a Privacy-Preserving Federated Deep Learning framework with irregular users. Specifically, we design a novel solution to reduce the negative impact of irregular users on the training accuracy, which guarantees that the training results are mainly computed from the contributions of high-quality data. Meanwhile, we exploit Yao's garbled circuits and additively homomorphic cryptosystems to ensure the confidentiality of all user-related information. Moreover, PPFDL is robust to users dropping out during execution: each user can go offline at any subprocess of training, as long as the remaining online users can still complete the training task. Extensive experiments demonstrate the superior performance of PPFDL in terms of training accuracy, computation overhead, and communication overhead.
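The abstract does not spell out PPFDL's aggregation rule, but the core idea of favoring high-quality contributions can be illustrated with a minimal plain-text sketch. The snippet below is a hypothetical example, not the paper's protocol: the helper `quality_weighted_aggregate` and the median-distance weighting are assumptions chosen only for illustration, and the real framework would carry out such computations under Yao's garbled circuits and additively homomorphic encryption rather than in the clear.

```python
import numpy as np

def quality_weighted_aggregate(updates, eps=1e-8):
    """Aggregate user model updates, down-weighting irregular users.

    Plain-text illustration only: each user's weight is inversely
    proportional to the distance of its update from the coordinate-wise
    median, a simple proxy for data quality. PPFDL itself evaluates its
    reliability measure under garbled circuits and additively
    homomorphic encryption, which this sketch omits.
    """
    updates = np.asarray(updates)            # shape: (num_users, num_params)
    median = np.median(updates, axis=0)      # robust reference point
    dist = np.linalg.norm(updates - median, axis=1)
    weights = 1.0 / (dist + eps)             # far-from-median (low-quality) users get small weights
    weights /= weights.sum()
    return weights @ updates                 # weighted average of the updates

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    honest = rng.normal(0.0, 0.1, size=(8, 5))   # 8 regular users near the true update (≈0)
    noisy = rng.normal(3.0, 1.0, size=(2, 5))    # 2 irregular users with low-quality data
    print(quality_weighted_aggregate(np.vstack([honest, noisy])))  # stays close to 0
```

With plain federated averaging, the two irregular users would shift the aggregate noticeably toward their noisy updates; the quality weighting keeps the result dominated by the high-quality contributions, which is the effect the paper's accuracy guarantees target.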

Keywords: federated deep; preserving federated; irregular users; privacy preserving; deep learning

Journal Title: IEEE Transactions on Dependable and Secure Computing
Year Published: 2022
