Federated Learning (FL) is a framework in which multiple parties jointly train a model without sharing private data. Protecting private information is a critical problem in FL. However, the communication overheads of existing solutions are too heavy for IoT devices in resource-constrained environments, and these solutions cannot ensure robustness when IoT devices go offline. In this paper, Democratic Federated Learning (DemoFL) is proposed, a privacy-preserving FL framework with sufficiently low communication overheads. DemoFL incorporates a consensus module to keep the system robust. It also employs a tree structure to reduce communication overheads and achieves high robustness without sacrificing accuracy. The proposed algorithm reduces the communication complexity of aggregation during training by a factor of $M$, where $M$ is a controllable parameter. Extensive experiments have been conducted to evaluate the efficiency of the proposed method, and the results demonstrate the practicality of the proposed framework for IoT devices in unstable environments.
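To illustrate the idea of tree-structured aggregation with a controllable branching factor $M$, the following is a minimal sketch, not DemoFL's actual algorithm: the consensus module, client grouping strategy, and privacy mechanisms described in the paper are omitted, and the helper names are hypothetical.

```python
# Illustrative sketch of tree-structured (hierarchical) model aggregation.
# Assumption: M acts as the branching factor, so each node talks to at most
# M peers per level instead of all N clients at once.
import numpy as np


def aggregate_group(updates, weights):
    """Weighted average of one group's model updates (FedAvg-style)."""
    total = sum(weights)
    return sum(w * u for w, u in zip(weights, updates)) / total


def tree_aggregate(updates, weights, M=4):
    """Aggregate client updates level by level in groups of at most M."""
    assert M >= 2, "branching factor must be at least 2 to make progress"
    while len(updates) > 1:
        next_updates, next_weights = [], []
        for i in range(0, len(updates), M):
            group_u = updates[i:i + M]
            group_w = weights[i:i + M]
            next_updates.append(aggregate_group(group_u, group_w))
            next_weights.append(sum(group_w))
        updates, weights = next_updates, next_weights
    return updates[0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    client_updates = [rng.normal(size=10) for _ in range(16)]
    client_sizes = [1.0] * 16
    global_update = tree_aggregate(client_updates, client_sizes, M=4)
    # With equal weights the result matches plain averaging over all clients.
    assert np.allclose(global_update, np.mean(client_updates, axis=0))
```

In this sketch the weighted aggregation result is identical to flat averaging; the tree only changes who communicates with whom, which is where the M-fold reduction in per-node aggregation communication would come from.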