Towards End-to-End Secure and Efficient Federated Learning for XGBoost

Title:
Towards End-to-End Secure and Efficient Federated Learning for XGBoost
Journal Title:
fl-aaai-2022
DOI:
Keywords:
Publication Date:
02 March 2022
Citation:
Jin, C., Wang, J., Teo, S. G., Zhang, L., Chan, C. S., Hou, Q., Aung, K. M. M. Towards End-to-End Secure and Efficient Federated Learning for XGBoost. fl-aaai-2022
Abstract:
Federated learning refers to the distributed and privacy-preserving collaborative machine learning paradigm, in which multiple independent data owners jointly train machine learning models without revealing their private data to each other. In this paper, we study federated learning on XGBoost models in vertical data partition settings, where the data owners share a common set of training samples and each data owner possesses a disjoint subset of the features. We propose CryptoBoost, a federated XGBoost system based on multi-party homomorphic encryption techniques. CryptoBoost improves on previous work in three main aspects. 1) CryptoBoost is end-to-end secure: the models are trained and stored in a completely encrypted and private manner. 2) CryptoBoost eliminates any central or privileged node that knows or controls more information than the other nodes, and the federated learning and inference processes are performed in a fully decentralized way. 3) We propose a set of new secure computation algorithms and protocols for CryptoBoost, which achieve improved performance and communication efficiency compared with existing approaches.
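For readers unfamiliar with the vertical data partition setting described in the abstract, the following minimal sketch (not part of the paper; owner names and feature columns are hypothetical) illustrates how two data owners can hold disjoint feature subsets over a shared set of sample IDs. It uses plain pandas and performs no encryption; the in-the-clear join at the end is shown only to make the partition structure explicit and never occurs in a federated setting.

```python
# Illustrative sketch of a vertical data partition (hypothetical example,
# not the CryptoBoost implementation): two owners share sample IDs but
# hold disjoint feature columns; the labels reside with one owner.
import pandas as pd

# Common set of training samples, identified by a shared ID.
sample_ids = [101, 102, 103, 104]

# Owner A holds features f1, f2 and, in this sketch, the labels.
owner_a = pd.DataFrame({
    "id": sample_ids,
    "f1": [0.2, 0.5, 0.1, 0.9],
    "f2": [1.0, 0.3, 0.8, 0.4],
    "label": [0, 1, 0, 1],
})

# Owner B holds a disjoint feature subset f3, f4 for the same samples.
owner_b = pd.DataFrame({
    "id": sample_ids,
    "f3": [3.1, 2.2, 4.0, 1.5],
    "f4": [0.7, 0.6, 0.2, 0.9],
})

# In federated XGBoost training this join never happens in the clear;
# it is shown here only to visualize what the combined feature view covers.
full_view = owner_a.merge(owner_b, on="id")
print(full_view)
```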
License type:
Publisher Copyright
Funding Info:
This research project is supported by the RIE2020 Advanced Manufacturing and Engineering (AME) Programmatic Programme
Grant Reference no. : A19E3b0099
Description:
ISBN:

Files uploaded:

File: fl-aaai-22-paper-25-amended.pdf (291.72 KB, PDF)