Secure Federated Learning Based on Coded Distributed Computing

Abstract

Federated learning (FL) enables multiple learning devices to exchange their training results and collaboratively develop a shared learning model without revealing their local data, thereby preserving data privacy. However, contemporary FL models have several drawbacks, including limited security against malicious learning devices that generate arbitrarily erroneous training results. Recently, a promising concept, coded distributed computing (CDC), has been proposed for maintaining the security of various distributed systems by adding computational redundancy to the datasets exchanged in these systems. Although the CDC concept has already been adopted in several applications, it has yet to be applied to FL systems. Accordingly, in this paper, we develop the first integrated FL-CDC model, a low-complexity approach for enhancing the security of FL systems. We implement the model for predicting traffic slowness in vehicular applications and verify that the model can effectively secure the system even when the number of malicious devices is large.
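The core CDC idea the abstract describes, adding computational redundancy so that arbitrarily erroneous results from malicious devices can be tolerated, can be illustrated with a minimal sketch. This is not the paper's actual FL-CDC construction; it is a hypothetical simulation in which each data partition's gradient is computed by three devices (one of them malicious) and the server decodes each partition with a coordinate-wise median, which recovers the honest value when a majority of replicas agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 data partitions; the true partial gradient of each
# partition is a length-4 vector. Replication factor r = 3 (assumed for
# illustration, not taken from the paper).
true_grads = [rng.normal(size=4) for _ in range(3)]

def device_result(partition_grad, malicious):
    # A malicious device returns an arbitrary erroneous result;
    # an honest device returns the correct partial gradient.
    if malicious:
        return rng.normal(scale=100.0, size=partition_grad.shape)
    return partition_grad

# Each partition is computed redundantly by r = 3 devices; device 0 is malicious.
decoded = []
for g in true_grads:
    replies = np.stack([device_result(g, malicious=(i == 0)) for i in range(3)])
    # Coordinate-wise median: with 2 of 3 replicas honest and identical,
    # the median equals the honest value, discarding the outlier exactly.
    decoded.append(np.median(replies, axis=0))

# The server aggregates the decoded partial gradients as usual.
global_grad = np.mean(decoded, axis=0)
```

With a replication factor of r, this majority-style decoding tolerates up to floor((r-1)/2) malicious replicas per partition; real CDC schemes achieve the same resilience with less redundancy by using coded (rather than plainly replicated) computation assignments.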

Publication
2021 IEEE Globecom Workshops (GC Wkshps)