FedDual: Pair-Wise Gossip Helps Federated Learning in Large Decentralized Networks

There is significant recent interest in collaboratively training a machine learning (ML) model without gathering the data at a central server. Federated learning (FL) has emerged as an efficient solution that mitigates systemic privacy risks and communication costs. However, conventional FL, inherited from parameter-server designs, relies heavily on a central server, which may lead to privacy risks, communication bottlenecks, or a single point of failure. In this paper, we propose FedDual, an asynchronous and hierarchical local gradient aggregation and global model update algorithm, under three different security considerations for FL in large decentralized networks. In particular, FedDual preserves privacy by introducing local differential privacy (LDP) and aggregates local gradients asynchronously and hierarchically via a pair-wise gossip algorithm; this makes it more competitive than previous gossip-based decentralized FL methods in privacy preservation and communication efficiency, and more computationally efficient than existing blockchain-assisted decentralized FL methods. Further, we devise a noise cutting trick based on Private Set Intersection (PSI) to mitigate the loss in global-model prediction performance caused by the LDP noise. Rigorous analysis shows that FedDual theoretically enables decentralized FL to achieve the same $\mathcal{O}\left(\frac{1}{T}\right)$ convergence rate as centralized ML. Experiments on MNIST, CIFAR-10, and FEMNIST confirm that the prediction performance of models trained with FedDual is close to that of centralized ML. More importantly, the proposed noise cutting trick lets FedDual train global models with better prediction performance and faster convergence than existing LDP-based FL methods.
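
The two core mechanisms named in the abstract, LDP perturbation of local gradients and pair-wise gossip aggregation, can be illustrated with a minimal sketch. This is not the paper's FedDual implementation: the clipping bound, noise scale, and random pairing scheme below are illustrative assumptions, and the asynchronous/hierarchical scheduling and the PSI-based noise cutting are omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    def ldp_perturb(grad, clip=1.0, sigma=0.5):
        """Clip a local gradient to a bounded L2 norm and add Gaussian
        noise, in the style of a Gaussian-mechanism LDP perturbation.
        The values of clip and sigma here are arbitrary assumptions."""
        norm = np.linalg.norm(grad)
        grad = grad * min(1.0, clip / (norm + 1e-12))
        return grad + rng.normal(0.0, sigma * clip, size=grad.shape)

    def gossip_round(models):
        """One pair-wise gossip round: nodes are randomly matched into
        pairs, and each pair replaces its models with their average."""
        idx = rng.permutation(len(models))
        for a, b in zip(idx[::2], idx[1::2]):
            avg = 0.5 * (models[a] + models[b])
            models[a] = avg.copy()
            models[b] = avg.copy()
        return models

    # Toy run: 8 nodes each hold an LDP-perturbed local gradient;
    # repeated gossip rounds drive all nodes toward the network-wide
    # average without any central server.
    n_nodes, dim = 8, 4
    models = [ldp_perturb(rng.normal(size=dim)) for _ in range(n_nodes)]
    for _ in range(10):
        models = gossip_round(models)
    print("spread after gossip:", np.max(np.std(np.stack(models), axis=0)))

Because pair-wise averaging preserves the sum of all models, repeated rounds shrink the disagreement between nodes while keeping the network-wide average fixed, which is why no coordinator is needed for aggregation.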

Keywords: federated learning; pair-wise gossip; large decentralized networks

Journal Title: IEEE Transactions on Information Forensics and Security
Year Published: 2023
