
A Dropout-Tolerated Privacy-Preserving Method for Decentralized Crowdsourced Federated Learning
Authors: Chen, Tao; Wang, Xiaofen; Dai, Hong-Ning; Yang, Haomiao

Abstract: Mobile crowdsourcing federated learning (FL-MCS) allows a requester to outsource its model-training tasks to workers who have the desired data as well as strong computing power. FL-MCS can thereby overcome the limitations of participants' computing capability and data availability. However, FL-MCS still faces the problem of workers' data privacy leakage when malicious attacks (e.g., gradient inference attacks) are launched. To address this problem, several privacy-preserving FL-MCS (PPFL-MCS) schemes have been proposed that aggregate local models at a central server. Unfortunately, these schemes are vulnerable to single points of failure and other malicious attacks on the central server. Meanwhile, workers may drop out of an online task because of erratic communication networks, causing the entire model aggregation to fail. To solve these issues, we propose a novel dropout-tolerated and privacy-preserving decentralized FL-MCS scheme, namely dropout-tolerated decentralized PPFL-MCS based on blockchain. Specifically, we define a novel cryptographic primitive, ID-based Aggregated Decryptable Broadcast Encryption (AD-IBBE), built on traditional ID-based broadcast encryption. In AD-IBBE, a sender's ciphertext can be decrypted only by the sender itself, while the aggregated ciphertext can be decrypted by all receivers in the broadcast group. We then design a homomorphic AD-IBBE algorithm, which is formally proved to be semantically secure. We further devise the decentralized PPFL-MCS scheme to guarantee the confidentiality of model gradients against internal and external adversaries. Moreover, we design a dropout-tolerated aggregation method to ensure the robustness of our decentralized PPFL-MCS scheme even if some workers lose connection. Extensive experimental results on different models and data sets demonstrate that the proposed scheme achieves model accuracy close to the non-dropout case. Even when some workers are offline, our scheme remains more efficient than existing schemes in terms of dropout aggregation overhead.
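For intuition only, the toy sketch below illustrates the general idea behind dropout-tolerant masked aggregation described in the abstract: each worker's masked update reveals nothing on its own, yet the sum of the surviving workers' updates can still be recovered after some workers go offline. This minimal Python example uses pairwise additive masks in the spirit of generic secure-aggregation protocols; it is not the paper's AD-IBBE construction, and every function name and parameter here is an illustrative assumption.

```python
# Toy sketch (NOT the paper's AD-IBBE scheme): pairwise additive masking to show
# how individual updates stay hidden while the aggregate of surviving workers
# remains recoverable after dropouts. Names and parameters are assumptions.
import random

MOD = 2**32  # arithmetic modulus for masking


def pairwise_seed(i: int, j: int) -> int:
    # In a real protocol this seed would come from a key agreement between i and j;
    # here it is derived deterministically for illustration only.
    lo, hi = min(i, j), max(i, j)
    return hash((lo, hi)) & 0xFFFFFFFF


def mask(update: int, worker: int, group: list[int]) -> int:
    # Each worker adds +PRG(seed_ij) toward higher-indexed peers and
    # -PRG(seed_ij) toward lower-indexed peers, so masks cancel in the full sum.
    masked = update
    for other in group:
        if other == worker:
            continue
        r = random.Random(pairwise_seed(worker, other)).randrange(MOD)
        masked += r if worker < other else -r
    return masked % MOD


def aggregate(masked_updates: dict[int, int], group: list[int]) -> int:
    # Sum the masked updates from surviving workers, then strip the residual
    # masks each survivor shared with workers that dropped out.
    survivors = list(masked_updates)
    total = sum(masked_updates.values()) % MOD
    dropped = [w for w in group if w not in masked_updates]
    for d in dropped:
        for s in survivors:
            r = random.Random(pairwise_seed(s, d)).randrange(MOD)
            total -= r if s < d else -r
    return total % MOD


if __name__ == "__main__":
    group = [1, 2, 3, 4]
    updates = {1: 10, 2: 20, 3: 30, 4: 40}
    masked = {w: mask(u, w, group) for w, u in updates.items()}
    del masked[4]                    # worker 4 drops out before aggregation
    print(aggregate(masked, group))  # 60 = 10 + 20 + 30
```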

Keywords: Decentralized; dropout tolerated; federated learning; mobile crowdsourcing; privacy preserving
Source: IEEE Internet of Things Journal, 2024, 11(2): 1788-1799
Language: English
Field: Other industry service outsourcing
Date added: 2024-04-08