Federated learning enables distributed model training over various computing nodes, e.g., mobile devices, where instead of sharing raw user data, computing nodes commit only model updates, without compromising data privacy. The quality of federated learning relies on the model updates contributed by computing nodes training on their local data. However, owing to various factors (e.g., training data size, mislabeled samples, skewed data distributions), the quality of the model updates from different nodes can vary dramatically, and indiscriminately aggregating low-quality updates can degrade the global model. To achieve efficient federated learning, in this paper we propose a novel framework named FAIR, i.e., Federated leArning with qualIty awaReness. FAIR integrates three major components: 1) learning quality estimation: we use the model aggregation weights (learned in the third component) to quantify, in reverse, each node's learning quality in a privacy-preserving manner, and we leverage historical learning records to infer its next-round learning quality; 2) quality-aware incentive mechanism: within the recruiting budget, we formulate a reverse auction to stimulate the participation of high-quality, low-cost computing nodes, and the mechanism is proven to be truthful, individually rational, and computationally efficient; and 3) auto-weighted model aggregation: based on gradient descent, we devise an auto-weighted model aggregation algorithm that automatically learns the optimal aggregation weights to further enhance the global model quality. Extensive experiments on real-world datasets and learning tasks demonstrate the efficacy of FAIR.
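To make the first component concrete, below is a minimal sketch of per-node learning-quality estimation. It assumes the next-round quality is inferred from an exponential moving average (EMA) over the aggregation weights a node received in past rounds; the EMA choice, the decay factor, and the class name QualityEstimator are illustrative assumptions, not the estimator specified by the paper.

```python
# Hypothetical quality estimator: quality is read off from past aggregation
# weights (the "reverse" quantification) and extrapolated with an EMA.
from collections import defaultdict


class QualityEstimator:
    def __init__(self, decay: float = 0.7):
        self.decay = decay                  # weight on the most recent record (assumed value)
        self.history = defaultdict(list)    # node_id -> list of past aggregation weights

    def record(self, node_id: str, agg_weight: float) -> None:
        """Store the aggregation weight a node received in the latest round."""
        self.history[node_id].append(agg_weight)

    def predict(self, node_id: str, default: float = 0.5) -> float:
        """Infer next-round learning quality as an EMA over the node's history."""
        records = self.history[node_id]
        if not records:
            return default                  # cold start: neutral prior
        estimate = records[0]
        for w in records[1:]:
            estimate = self.decay * w + (1.0 - self.decay) * estimate
        return estimate
```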
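For the second component, the following is a generic sketch of budgeted winner selection in a quality-aware reverse auction: nodes bid their costs and the server greedily recruits those with the highest quality-to-bid ratio until the budget is exhausted. The abstract does not give FAIR's exact winner-selection or payment rules (the payments that guarantee truthfulness and individual rationality are omitted here), so everything below is an assumption for illustration only.

```python
# Hypothetical reverse-auction winner selection under a recruiting budget.
from dataclasses import dataclass


@dataclass
class Bid:
    node_id: str
    cost: float      # claimed participation cost
    quality: float   # estimated learning quality (e.g., from QualityEstimator above)


def select_winners(bids: list[Bid], budget: float) -> list[Bid]:
    """Greedy selection by quality-per-cost ratio within the budget."""
    ranked = sorted(bids, key=lambda b: b.quality / b.cost, reverse=True)
    winners, spent = [], 0.0
    for bid in ranked:
        if spent + bid.cost <= budget:
            winners.append(bid)
            spent += bid.cost
    return winners
```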
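For the third component, here is a minimal sketch of auto-weighted aggregation for a linear model, assuming the aggregation weights are parameterized by a softmax and learned by gradient descent on a small held-out validation set. The objective, the softmax parameterization, and the hyperparameters are assumptions; the paper's actual algorithm may differ.

```python
# Hypothetical auto-weighted aggregation: learn simplex-constrained weights
# by gradient descent so that the weighted average of client models
# minimizes a validation loss.
import numpy as np


def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()


def auto_weighted_aggregate(client_params: np.ndarray,
                            X_val: np.ndarray,
                            y_val: np.ndarray,
                            steps: int = 200,
                            lr: float = 0.5):
    """client_params: (K, d) array, one linear-model parameter vector per node."""
    K = client_params.shape[0]
    theta = np.zeros(K)                           # unconstrained logits for the weights
    for _ in range(steps):
        w = softmax(theta)                        # aggregation weights on the simplex
        global_params = w @ client_params         # weighted model average, shape (d,)
        residual = X_val @ global_params - y_val  # validation error of the aggregated model
        grad_params = X_val.T @ residual / len(y_val)   # d(loss)/d(global_params)
        grad_w = client_params @ grad_params            # chain rule to the weights
        grad_theta = w * (grad_w - w @ grad_w)          # softmax Jacobian applied to grad_w
        theta -= lr * grad_theta
    w = softmax(theta)
    return w, w @ client_params
```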