Enhancing Federated Learning Robustness Using Data-Agnostic Model Pruning

Title:
Enhancing Federated Learning Robustness Using Data-Agnostic Model Pruning
Journal Title:
Advances in Knowledge Discovery and Data Mining
Publication Date:
27 May 2023
Citation:
Meng, M. H., Teo, S. G., Bai, G., Wang, K., & Dong, J. S. (2023). Enhancing Federated Learning Robustness Using Data-Agnostic Model Pruning. Lecture Notes in Computer Science, 441–453. https://doi.org/10.1007/978-3-031-33377-4_34
Abstract:
Federated learning enables multiple data owners with a common objective to participate in a machine learning task without sharing their raw data. In each round, clients train local models on their own data and upload the model parameters to update the global model. Recent studies have shown that this multi-agent form of machine learning is prone to adversarial manipulation: Byzantine attackers impersonating benign clients can stealthily disrupt or destroy the learning process. In this paper, we propose FLAP, a post-aggregation model pruning technique that enhances the Byzantine robustness of federated learning by effectively disabling the malicious and dormant components in the learned neural network models. Our technique is data-agnostic, requiring clients to submit neither their datasets nor their training outputs, and is therefore well aligned with the data locality of federated learning. FLAP is performed by the server immediately after aggregation, which makes it compatible with arbitrary aggregation algorithms and existing defensive techniques. Our empirical study demonstrates the effectiveness of FLAP under various settings: it reduces the error rate by up to 10.2% against state-of-the-art adversarial models, and it increases the average accuracy by up to 22.1% across different adversarial settings, mitigating adversarial impacts while preserving learning fidelity.
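The post-aggregation workflow described in the abstract can be illustrated with a minimal sketch: the server first aggregates client updates (here, plain FedAvg), then prunes low-magnitude weights from the resulting global model. The magnitude-percentile criterion below is an assumption used as a data-agnostic proxy for "dormant" components; the paper's actual pruning rule may differ.

```python
import numpy as np

def aggregate(client_weights):
    """FedAvg-style aggregation: element-wise mean of client updates."""
    return np.mean(np.stack(client_weights), axis=0)

def prune_dormant(weights, sparsity=0.25):
    """Zero out the smallest-magnitude fraction of weights.

    This magnitude-based criterion is an illustrative stand-in for
    FLAP's data-agnostic pruning; it needs no client data or outputs.
    """
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask

# Server-side round: aggregate first, then prune (post-aggregation),
# so the step is independent of the aggregation rule used.
rng = np.random.default_rng(0)
clients = [rng.normal(size=(4, 4)) for _ in range(5)]  # toy weight tensors
global_w = prune_dormant(aggregate(clients), sparsity=0.25)
```

Because pruning runs strictly after aggregation, `aggregate` could be swapped for any robust rule (e.g., coordinate-wise median) without changing the pruning step, which is the compatibility property the abstract claims.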
License type:
Publisher Copyright
Funding Info:
No specific funding was received for this research.
Description:
This version of the article has been accepted for publication, after peer review and is subject to Springer Nature’s AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/978-3-031-33377-4_34
ISBN (electronic):
978-3-031-33377-4
ISBN (print):
978-3-031-33376-7