Federated learning is a machine learning paradigm that enables collaborative learning among clients while keeping clients' data private. Federated multitask learning (FMTL) addresses the statistical challenge of non-independent and identically distributed (non-IID) data by training a personalized model for each client, yet it requires all clients to be online in every training round. To remove this full-participation requirement, we explore multitask learning combined with model clustering and first propose a clustered FMTL method that achieves multitask learning on non-IID data while simultaneously improving communication efficiency and model accuracy. To strengthen its privacy guarantees, we adopt a general dual-server architecture and further propose a secure clustered FMTL method by designing a series of secure two-party computation protocols. Convergence and security analyses are conducted to prove the correctness and security of our methods. Numerical evaluation on public data sets validates that our methods outperform state-of-the-art methods in handling non-IID data while protecting privacy.
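
To make the idea of clustered federated aggregation concrete, the sketch below shows one plausible way a server could cluster client updates and average models per cluster so that each client receives a personalized (cluster-level) model. This is a minimal illustration under assumed simplifications, not the paper's actual protocol: the clients are linear least-squares learners, the clustering rule is a simple cosine-similarity heuristic, and the dual-server architecture with secure two-party computation is omitted entirely. Function names such as `clustered_fedavg` and `cluster_clients` are hypothetical.

```python
# Minimal sketch of clustered federated aggregation (illustrative only).
# Assumptions not taken from the paper: linear clients, cosine-similarity
# clustering, no secure two-party computation / dual-server protocol.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local least-squares training starting from weights w."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def cluster_clients(updates, threshold=0.0):
    """Greedy clustering: a client joins a cluster if its update direction has
    cosine similarity above `threshold` with that cluster's centroid."""
    clusters, centroids = [], []
    for i, u in enumerate(updates):
        placed = False
        for c, mu in enumerate(centroids):
            sim = u @ mu / (np.linalg.norm(u) * np.linalg.norm(mu) + 1e-12)
            if sim > threshold:
                clusters[c].append(i)
                centroids[c] = np.mean([updates[j] for j in clusters[c]], axis=0)
                placed = True
                break
        if not placed:
            clusters.append([i])
            centroids.append(u.copy())
    return clusters

def clustered_fedavg(client_data, dim, rounds=10):
    """Each round: local updates, cluster the updates, average within each
    cluster, and return each client its own cluster's model (personalization)."""
    models = [np.zeros(dim) for _ in client_data]
    for _ in range(rounds):
        locals_ = [local_update(w, X, y) for w, (X, y) in zip(models, client_data)]
        updates = [lw - w for lw, w in zip(locals_, models)]
        for idxs in cluster_clients(updates):
            cluster_model = np.mean([locals_[i] for i in idxs], axis=0)
            for i in idxs:
                models[i] = cluster_model
    return models

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two latent tasks (non-IID): half the clients follow w_a, half follow w_b.
    w_a, w_b = np.array([1.0, -2.0]), np.array([-3.0, 0.5])
    data = []
    for k in range(6):
        X = rng.normal(size=(50, 2))
        w_true = w_a if k < 3 else w_b
        data.append((X, X @ w_true + 0.01 * rng.normal(size=50)))
    for i, w in enumerate(clustered_fedavg(data, dim=2)):
        print(f"client {i}: {np.round(w, 2)}")
```

In this toy run the clients generated from the same latent task end up in the same cluster and converge toward that task's weights, which is the kind of personalization effect clustered FMTL targets on non-IID data; in the paper's setting the aggregation would additionally be carried out under the secure two-party computation protocols between the two servers.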
               