Recent studies have demonstrated the potential of federated learning (FL) for cooperative and privacy-preserving data analytics. It would also be promising to employ FL in vehicular ad hoc networks (VANETs) for cooperative learning tasks among connected vehicles, such as steering angle prediction, trajectory prediction, and drivable road detection. However, since VANETs are characterized by ad hoc cooperating vehicles with non-independent and identically distributed (Non-IID) data, directly applying existing FL frameworks to VANETs may cause extensive communication overhead and degraded model performance. Further, most deep learning models incorporated in existing FL frameworks rely heavily on manually annotated data, leading to high labeling costs. To address these issues, in this paper we propose an efficient and effective Federated End-to-End Learning framework for cooperative learning tasks in VANETs, named FEEL. Specifically, we first formulate a distributed optimization problem for cooperative deep learning tasks with Non-IID data in multi-hop cluster VANETs. Second, two algorithms are designed for inter-cluster learning and inner-cluster learning, respectively, to reduce the communication overhead and fit Non-IID data. Third, a Paillier-based communication protocol is crafted, allowing the central server to securely update model parameters without learning the real updates from each cooperating base station. Extensive experiments on two real-world datasets are conducted under various data distributions and VANET topologies, demonstrating the high efficiency and effectiveness of the proposed FEEL framework in both regression and classification tasks.
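To illustrate the idea behind the Paillier-based protocol, the following is a minimal, self-contained sketch (not the authors' actual protocol) of how Paillier's additive homomorphism lets a central server aggregate model updates without seeing any individual base station's update: each base station quantizes and encrypts its local update, the server multiplies ciphertexts (which adds the plaintexts), and only the key holder decrypts the aggregate. The keygen/encrypt/decrypt helpers, the quantization scale, and the example updates are illustrative assumptions; the tiny primes are insecure and chosen only to keep the demo fast.

```python
import math
import secrets

# --- Toy Paillier cryptosystem (insecure parameters, illustration only) ---
def keygen(p=1_000_003, q=1_000_033):
    n = p * q
    n_sq = n * n
    g = n + 1                                  # common simplification g = n + 1
    lam = math.lcm(p - 1, q - 1)
    # mu = (L(g^lam mod n^2))^{-1} mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n_sq = n * n
    r = secrets.randbelow(n - 1) + 1           # random blinding factor
    return (pow(g, m % n, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(priv, c):
    lam, mu, n = priv
    n_sq = n * n
    m = ((pow(c, lam, n_sq) - 1) // n) * mu % n
    return m - n if m > n // 2 else m          # map back to signed range

SCALE = 10_000                                 # fixed-point quantization of float updates

def aggregate(pub, encrypted_updates):
    """Server side: multiply ciphertexts mod n^2, which adds the hidden plaintexts."""
    n_sq = pub[0] ** 2
    sums = encrypted_updates[0]
    for enc in encrypted_updates[1:]:
        sums = [(a * b) % n_sq for a, b in zip(sums, enc)]
    return sums

if __name__ == "__main__":
    pub, priv = keygen()
    # Hypothetical per-base-station model updates (e.g., gradients of 3 weights).
    updates = [[0.12, -0.05, 0.30], [0.08, 0.01, -0.10], [-0.02, 0.04, 0.25]]
    encrypted = [[encrypt(pub, round(w * SCALE)) for w in u] for u in updates]
    enc_sum = aggregate(pub, encrypted)
    # Only the key holder decrypts the *aggregate*; individual updates stay hidden.
    avg = [decrypt(priv, c) / SCALE / len(updates) for c in enc_sum]
    print(avg)                                 # approx. [0.06, 0.0, 0.15]
```

In a VANET setting, the decryption key would be held apart from the aggregating server (or shared among parties), so neither the server nor any single base station can recover another station's raw update.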