
Risk Minimization Against Transmission Failures of Federated Learning in Mobile Edge Networks



A variety of modern AI products fundamentally require raw user data for training diverse machine learning models. With increasing concern over data privacy, federated learning, a decentralized learning framework, enables privacy-preserving model training by iteratively aggregating model updates from participants instead of raw data. Since all participants, i.e., mobile devices, must transfer their local model updates concurrently and iteratively over mobile edge networks, the network is easily overloaded, leading to a high risk of transmission failures. Although previous work on transmission protocols strives to avoid transmission collisions, the number of iterative concurrent transmissions must be fundamentally decreased. Inspired by the fact that raw data are often generated unevenly among devices, devices holding only a small proportion of the data can be safely excluded, since they have little effect on model convergence. To further guarantee model accuracy, we propose selecting a subset of devices as participants so that a given proportion of the data remains involved, and correspondingly minimizing the risk of transmission failures during model updates. We then design a randomized algorithm (ranRFL) that chooses suitable participants using a series of delicately calculated probabilities, and prove that the result concentrates on its optimum with high probability. Extensive simulations show that through careful participant selection, ranRFL decreases the maximal error rate of model updates by up to 38.3% compared with state-of-the-art schemes.
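The selection idea in the abstract — randomly admit participants, weighted toward devices with more data, until a target fraction of all data is covered — can be sketched as follows. This is an illustrative simplification, not the paper's ranRFL: the abstract does not specify the actual probabilities, so data-size-proportional sampling and the `coverage` parameter are assumptions introduced here.

```python
import random

def select_participants(data_sizes, coverage=0.8, seed=0):
    """Sketch of randomized participant selection for federated learning.

    Repeatedly samples devices without replacement, with probability
    proportional to local data size, until the selected devices jointly
    hold at least `coverage` of all data. Hypothetical stand-in for the
    paper's ranRFL probabilities, which are not given in the abstract.
    """
    rng = random.Random(seed)
    total = sum(data_sizes)
    remaining = list(range(len(data_sizes)))
    selected, covered = [], 0
    while remaining and covered / total < coverage:
        weights = [data_sizes[i] for i in remaining]
        pick = rng.choices(remaining, weights=weights, k=1)[0]
        remaining.remove(pick)
        selected.append(pick)
        covered += data_sizes[pick]
    return selected

# Uneven data across 6 devices: small holders are usually excluded,
# shrinking the number of concurrent uplink transmissions.
sizes = [500, 300, 120, 50, 20, 10]
chosen = select_participants(sizes, coverage=0.8)
```

Excluded devices never transmit, which is exactly the lever the abstract uses to reduce concurrent transmissions and hence the transmission-failure risk, while the coverage constraint bounds the accuracy loss.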

Keywords: federated learning; model updates; transmission failures; edge networks; transmission; mobile edge

Journal Title: IEEE Access
Year Published: 2020


