Computer Science ›› 2025, Vol. 52 ›› Issue (2): 1-7.doi: 10.11896/jsjkx.240100023


• Discipline Frontier •

Survey of Communication Efficiency for Federated Learning

ZHENG Jianwen1, LIU Bo1, LIN Weiwei2,3, XIE Jiachen1   

  1 School of Computer Science,South China Normal University,Guangzhou 510631,China
    2 Pengcheng Laboratory,Shenzhen,Guangdong 518066,China
    3 School of Computer Science and Engineering,South China University of Technology,Guangzhou 510640,China
  • Received:2024-01-02 Revised:2024-05-22 Online:2025-02-15 Published:2025-02-17
  • About author:ZHENG Jianwen,born in 2000,postgraduate.His main research interest is federated learning.
    LIN Weiwei,born in 1980,Ph.D,professor,is a distinguished member of CCF(No.37313D).His main research interests include cloud computing,big data technology and AI application technology.
  • Supported by:
    National Natural Science Foundation of China(62072187),Guangzhou Development Zone Science and Technology Project(2023GH02) and Major Key Project of PCL,China(PCL2023A09).

Abstract: As a distributed machine learning paradigm, federated learning (FL) aims to collaboratively train machine learning models on decentralized data sources while preserving data privacy. In practical applications, however, FL faces a communication-efficiency challenge: each iteration requires substantial communication to transmit model parameters and gradient updates, so communication costs far surpass computation costs. Effectively improving communication efficiency is therefore a significant challenge in FL research. This paper introduces the importance of communication efficiency in FL and, according to their different emphases, divides existing research on FL communication efficiency into client selection, model compression, network topology reconstruction, and combinations of multiple techniques. Building on this body of work, it summarizes the difficulties and challenges that communication efficiency poses for the development of FL, and explores future research directions for FL communication efficiency.
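Of the directions the survey categorizes, model compression is the most directly illustrable: in top-k gradient sparsification (the approach of reference [17]), each client transmits only the largest-magnitude entries of its update instead of the dense tensor. The sketch below is a minimal, hypothetical illustration of that idea (the function names, NumPy representation, and the 1% default ratio are assumptions of this example, not code from the survey):

```python
import numpy as np

def topk_sparsify(update, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of entries.

    Clients send (indices, values) instead of the dense update,
    cutting per-round upload volume roughly by a factor of 1/ratio.
    """
    flat = update.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k entries with the largest absolute value
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def densify(indices, values, shape):
    """Server-side reconstruction of the sparse update (zeros elsewhere)."""
    flat = np.zeros(int(np.prod(shape)))
    flat[indices] = values
    return flat.reshape(shape)
```

In practice such schemes are usually paired with error feedback (accumulating the dropped residual locally), since naive truncation slows convergence.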

Key words: Federated learning, Communication efficiency, Client selection, Model compression, Network topology reconstruction

CLC Number: TP181
[1]ZHANG C,XIE Y,BAI H,et al.A survey on federated learning[J].Knowledge-Based Systems,2021,216:106775.
[2]MAMMEN P M.Federated learning:Opportunities and challenges [EB/OL].(2021-01-14) [2024-03-12].https://arxiv.org/abs/2101.05428.
[3]SHAHID O,POURIYEH S,PARIZI R M,et al.Communication efficiency in federated learning:Achievements and challenges [EB/OL].(2021-07-23)[2024-03-12].https://arxiv.org/abs/2107.10996.
[4]ASAD M,MOUSTAFA A,ITO T,et al.Evaluating the communication efficiency in federated learning algorithms[C]//2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design(CSCWD).IEEE,2021:552-557.
[5]WEN J,ZHANG Z,LAN Y,et al.A survey on federated learning:challenges and applications[J].International Journal of Machine Learning and Cybernetics,2023,14(2):513-535.
[6]KAIROUZ P,MCMAHAN H B,AVENT B,et al.Advances and open problems in federated learning[J].Foundations and Trends in Machine Learning,2021,14(1/2):214-217.
[7]LIU B,LV N,GUO Y,et al.Recent Advances on Federated Learning:A Systematic Survey[EB/OL].(2023-01-03)[2024-03-12].https://arxiv.org/abs/2301.01299.
[8]POURIYEH S,SHAHID O,PARIZI R M,et al.Secure smart communication efficiency in federated learning:Achievements and challenges[J].Applied Sciences,2022,12(18):8980.
[9]SHAHEEN M,FAROOQ M S,UMER T,et al.Applications of federated learning;Taxonomy,challenges,and research trends[J].Electronics,2022,11(4):670.
[10]WANG L P,WANG W,LI B.CMFL:Mitigating communication overhead for federated learning[C]//2019 IEEE 39th International Conference on Distributed Computing Systems(ICDCS).IEEE,2019:954-964.
[11]LI C,ZENG X,ZHANG M,et al.PyramidFL:A fine-grained client selection framework for efficient federated learning[C]//Proceedings of the 28th Annual International Conference on Mobile Computing and Networking.2022:158-171.
[12]WANG H,KAPLAN Z,NIU D,et al.Optimizing federated learning on non-iid data with reinforcement learning[C]//IEEE INFOCOM 2020-IEEE Conference on Computer Communications.IEEE,2020:1698-1707.
[13]NISHIO T,YONETANI R.Client selection for federated learning with heterogeneous resources in mobile edge[C]//2019 IEEE International Conference on Communications(ICC).IEEE,2019:1-7.
[14]DU C,XIAO J,GUO W.Bandwidth constrained client selection and scheduling for federated learning over SD-WAN[J].IET Communications,2022,16(2):187-194.
[15]XU J,DU W,JIN Y,et al.Ternary compression for communication-efficient federated learning[J].IEEE Transactions on Neural Networks and Learning Systems,2020,33(3):1162-1176.
[16]MALEKIJOO A,FADAEIESLAM M J,MALEKIJOU H,et al.FedZip:A compression framework for communication-efficient federated learning[EB/OL].(2021-02-02)[2024-03-12].https://arxiv.org/abs/2102.01593.
[17]AJI A F,HEAFIELD K.Sparse Communication for Distributed Gradient Descent[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing.2017:440-445.
[18]HUFFMAN D A.A method for the construction of minimum-redundancy codes[J].Proceedings of the IRE,1952,40(9):1098-1101.
[19]ZHU X,WANG J,CHEN W,et al.Model compression and privacy preserving framework for federated learning[J].Future Generation Computer Systems,2023,140:376-389.
[20]ZHAO J,ZHU H,WANG F,et al.CORK:A privacy-preserving and lossless federated learning scheme for deep neural network[J].Information Sciences,2022,603:190-209.
[21]HE Y,WANG H P,ZENK M,et al.CosSGD:Communication-efficient federated learning with a simple cosine-based quantization [EB/OL].(2020-12-15)[2024-03-12].https://arxiv.org/abs/2012.08241.
[22]HU C,JIANG J,WANG Z.Decentralized federated learning:A segmented gossip approach[EB/OL].(2019-08-21)[2024-03-12].https://arxiv.org/abs/1908.07782.
[23]WANG Z,HU Y,YAN S,et al.Efficient ring-topology decentralized federated learning with deep generative models for medical data in ehealthcare systems[J].Electronics,2022,11(10):1548.
[24]ABDELLATIF A A,MHAISEN N,MOHAMED A,et al.Communication-efficient hierarchical federated learning for IoT heterogeneous systems with imbalanced data[J].Future Generation Computer Systems,2022,128:406-419.
[25]LU X,LIAO Y,LIO P,et al.Privacy-preserving asynchronous federated learning mechanism for edge network computing[J].IEEE Access,2020,8:48970-48981.
[26]LIU S,YU G,YIN R,et al.Joint model pruning and device selection for communication-efficient federated edge learning[J].IEEE Transactions on Communications,2021,70(1):231-244.
[27]BUYUKATES B,ULUKUS S.Timely communication in federated learning[C]//IEEE Conference on Computer Communications Workshops(INFOCOM WKSHPS).IEEE,2021:1-6.
[28]CALDAS S,KONECNY J,MCMAHAN H B,et al.Expanding the reach of federated learning by reducing client resource requirements[EB/OL].(2018-12-18)[2024-03-12].https://arxiv.org/abs/1812.07210.
[29]CHEN Y,SUN X,JIN Y.Communication-efficient federated deep learning with layerwise asynchronous model update and temporally weighted aggregation[J].IEEE Transactions on Neural Networks and Learning Systems,2019,31(10):4229-4238.
[30]PARK S,SUH Y,LEE J.FedPSO:Federated learning using particle swarm optimization to reduce communication costs[J].Sensors,2021,21(2):600.
[31]MILLS J,HU J,MIN G.Communication-efficient federated learning for wireless edge intelligence in IoT[J].IEEE Internet of Things Journal,2019,7(7):5986-5994.
[32]ASAD M,MOUSTAFA A,ITO T.FedOpt:Towards communication efficiency and privacy preservation in federated learning[J].Applied Sciences,2020,10(8):2864.
[33]LI C,LI G,VARSHNEY P K.Communication-efficient federated learning based on compressed sensing[J].IEEE Internet of Things Journal,2021,8(20):15531-15541.
[34]WU C,WU F,LYU L,et al.Communication-efficient federated learning via knowledge distillation[J].Nature Communications,2022,13(1):2032.
[35]ZHU H,JIN Y.Multi-objective evolutionary federated learning[J].IEEE Transactions on Neural Networks and Learning Systems,2019,31(4):1310-1322.
[36]MARFOQ O,XU C,NEGLIA G,et al.Throughput-optimal topology design for cross-silo federated learning[J].Advances in Neural Information Processing Systems,2020,33:19478-19487.
[37]GUPTA R,ALAM T.Survey on federated-learning approaches in distributed environment[J].Wireless Personal Communications,2022,125(2):1631-1652.
[38]ZHANG J,ZHU H,WANG F,et al.Security and privacy threats to federated learning:Issues,methods,and challenges[J].Security and Communication Networks,2022,2022(1):2886795.
[39]SONG D,SHEN G,GAO D,et al.Fast heterogeneous federated learning with hybrid client selection[C]//Uncertainty in Artificial Intelligence.PMLR,2023:2006-2015.
[40]XUE Y,LAU V.Riemannian Low-Rank Model Compression for Federated Learning with Over-the-Air Aggregation[J/OL].https://onlinelibrary.wiley.com/doi/10.1155/2022/2886795.
[41]ZITZLER E,LAUMANNS M,THIELE L.SPEA2:Improving the strength Pareto evolutionary algorithm[C]//Proceedings of the EUROGEN'2001.2001.
[42]COELLO C A C,LECHUGA M S.MOPSO:A proposal for multiple objective particle swarm optimization[C]//Proceedings of the 2002 Congress on Evolutionary Computation,CEC'02(Cat.No.02TH8600).IEEE,2002,2:1051-1056.
[43]PUTRA M A P,PUTRI A R,ZAINUDIN A,et al.ACS:Accuracy-based client selection mechanism for federated industrial IoT[J].Internet of Things,2023,21:100657.
[44]RAI S,KUMARI A,PRASAD D K.Client selection in federatedlearning under imperfections in environment[J].AI,2022,3(1):124-145.
[45]CHAI Z Y,YANG C,LI Y L.Communication efficiency optimization in federated learning based on multi-objective evolutionary algorithm[J].Evolutionary Intelligence,2023,16(3):1033-1044.