
Enhancing Recommendation Capabilities Using Multi-Head Attention-Based Federated Knowledge Distillation


With the growth of the internet and mobile computing, recommendation algorithms are widely used to manage large amounts of data. However, traditional recommendation systems usually require collecting user data on a central server, which may compromise user privacy. Furthermore, data and models from different organizations may be proprietary and cannot be shared directly, leading to data isolation. To address these challenges, we propose a method that combines federated learning (FL) with recommendation systems using a federated knowledge distillation algorithm based on a multi-head attention mechanism. In the proposed approach, knowledge distillation is introduced on top of FL to guide the training of the local network and facilitate knowledge transfer. Further, to address non-independent and identically distributed (non-IID) training samples in FL, a Wasserstein distance term and regularization terms are incorporated into the objective function of federated knowledge distillation to reduce the distribution difference between the server and client networks. A multi-head attention mechanism is used to enhance user encoding information, and a combined adaptive learning rate is adopted to further improve convergence. Compared to the benchmark model, this approach demonstrates significant improvements, with accuracy improved by up to 10%, model training time shortened by approximately 45%, and average error and NDCG values decreased by 10%.
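The abstract does not include implementation details. The following is a minimal sketch, assuming a PyTorch-style setup, of how the described objective might combine a task loss, a soft-label distillation term between server (teacher) and client (student) networks, and a rough Wasserstein-style penalty on their output distributions, with a multi-head attention user encoder. All module names, hyperparameters, and the 1-D Wasserstein approximation are illustrative assumptions, not the authors' code.

    # Illustrative sketch only -- not the authors' implementation.
    # Assumes PyTorch; names and constants are hypothetical placeholders.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class UserEncoder(nn.Module):
        """Encodes a user's interaction history with multi-head self-attention."""

        def __init__(self, embed_dim=64, num_heads=4):
            super().__init__()
            self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

        def forward(self, item_embeddings):
            # item_embeddings: (batch, seq_len, embed_dim)
            attended, _ = self.attn(item_embeddings, item_embeddings, item_embeddings)
            return attended.mean(dim=1)  # pooled user representation

    def federated_kd_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5, beta=0.1):
        """Task loss + soft-label distillation + distribution-gap penalty.

        The Wasserstein term is approximated here by a 1-D sorted-difference
        distance between the client (student) and server (teacher) logits,
        standing in for the regularizer described in the abstract.
        """
        task = F.cross_entropy(student_logits, labels)
        distill = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2
        w1 = (torch.sort(student_logits, dim=-1).values
              - torch.sort(teacher_logits, dim=-1).values).abs().mean()
        return task + alpha * distill + beta * w1

In the actual method, each client would minimize such an objective locally under the combined adaptive learning-rate scheme mentioned in the abstract before aggregation; the weights alpha and beta and the temperature above are placeholders, not values reported by the paper.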

Keywords: multi-head; federated knowledge; knowledge distillation; recommendation

Journal Title: IEEE Access
Year Published: 2023


