Computer Science ›› 2025, Vol. 52 ›› Issue (12): 314-320. doi: 10.11896/jsjkx.241100085
彭姣1, 常永娟1, 严韬2, 游张政2, 宋美娜2, 朱一凡2, 张鹏飞1, 贺月1, 张博1, 欧中洪3
PENG Jiao1, CHANG Yongjuan1, YAN Tao2, YOU Zhangzheng2, SONG Meina2, ZHU Yifan2, ZHANG Pengfei1, HE Yue1, ZHANG Bo1, OU Zhonghong3
Abstract: In recent years, the rapid development of deep learning, mobile devices, and Internet of Things technologies has driven a surge in demand for model inference and data storage on edge devices. Traditional centralized model training is constrained by data volume, communication bandwidth, and user data privacy, and cannot effectively meet these new challenges. Federated learning emerged in response: it allows edge devices to train models on their local data and upload only model parameters to a central server for aggregation and redistribution, enabling joint modeling without data leaving each party's trusted domain. Decentralized federated learning was developed further to address latency, bandwidth limitations, and the risk of a single point of failure. In real network environments, however, network latency and bandwidth severely degrade the training efficiency of federated learning and make multi-party joint modeling difficult. To address this problem, this paper proposes DBFedAvg, a delay-sensitive decentralized federated learning algorithm. A dynamic selection algorithm chooses the node with the smallest average latency as the master node, which reduces communication cost, improves global model training performance, and accelerates model convergence. Experimental results in scenarios such as the Sprint network topology confirm that the proposed method yields substantial improvements in both communication cost and model convergence performance.
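The two core ideas in the abstract can be sketched briefly. The snippet below is an illustrative assumption, not the paper's actual implementation: `select_master` picks the node whose mean delay to all peers is smallest (one plausible reading of "average latency" in the dynamic selection step), and `fedavg` is the standard sample-size-weighted parameter average from McMahan et al.'s FedAvg. All function names and the latency-matrix format are hypothetical.

```python
def select_master(latency_matrix):
    """Choose the node with the smallest average delay to its peers.

    latency_matrix[i][j] is the measured delay (e.g. in ms) from node i
    to node j; diagonal entries are ignored. The argmin-of-mean rule is
    an assumption based on the abstract's description.
    """
    n = len(latency_matrix)
    avg = [sum(row[j] for j in range(n) if j != i) / (n - 1)
           for i, row in enumerate(latency_matrix)]
    return min(range(n), key=avg.__getitem__)


def fedavg(weights_list, sample_counts):
    """Standard FedAvg: average flattened model weights, weighted by
    each client's local sample count."""
    total = sum(sample_counts)
    dim = len(weights_list[0])
    return [sum(w[k] * c for w, c in zip(weights_list, sample_counts)) / total
            for k in range(dim)]
```

In a decentralized round, each node would first measure delays to its peers, all nodes would agree on the master via `select_master`, and that master would run `fedavg` over the received parameter vectors before redistributing the result.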