Computer Science, 2022, Vol. 49, Issue (12): 5-16. doi: 10.11896/jsjkx.220300204

• Federated Learning •

Study on Transmission Optimization for Hierarchical Federated Learning

ZOU Sai-lan1,2, LI Zhuo1,2, CHEN Xin2   

  1 Beijing Key Laboratory of Internet Culture and Digital Dissemination Research(Beijing Information Science & Technology University),Beijing 100101,China
    2 School of Computer Science,Beijing Information Science & Technology University,Beijing 100101,China
  • Received: 2022-03-21  Revised: 2022-07-06  Published: 2022-12-14
  • About author: ZOU Sai-lan, born in 1997, postgraduate. Her main research interests include edge computing. LI Zhuo, born in 1983, Ph.D, associate professor, is a member of China Computer Federation. His main research interests include mobile wireless networks and distributed computing.
  • Supported by:
    National Natural Science Foundation of China(61872044)and Beijing Municipal Program for Top Talent.

Abstract: Compared with traditional machine learning, federated learning effectively addresses the protection of user data privacy and security, but the large number of model exchanges between massive numbers of nodes and the cloud server incurs high communication costs. Cloud-edge-device hierarchical federated learning has therefore received increasing attention. In hierarchical federated learning, D2D and opportunistic communication can be used for cooperative model training among mobile nodes; edge servers perform local model aggregation, while the cloud server performs global model aggregation. To improve the convergence rate of the model, network transmission optimization techniques for hierarchical federated learning are studied. This paper introduces the concept and algorithmic principles of hierarchical federated learning, summarizes the key challenges that cause network communication overhead, and analyzes six network transmission optimization methods: selecting appropriate nodes, enhancing local computing, reducing the number of local model update uploads, compressing model updates, decentralized training, and parameter-aggregation-oriented transmission. Finally, future research directions are summarized and discussed.
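
To make the aggregation hierarchy concrete, the sketch below illustrates a cloud-edge-client FedAvg loop of the kind surveyed here. It is a minimal illustration under simplifying assumptions (plain NumPy parameter vectors, synthetic least-squares clients, single-step local training); all function names and the two-edge, three-client topology are hypothetical rather than taken from the paper.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, local_epochs=1):
    """Client side: a few gradient steps on the local least-squares objective."""
    w = w.copy()
    for _ in range(local_epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of the mean squared error
        w -= lr * grad
    return w

def weighted_average(models, sizes):
    """FedAvg-style aggregation weighted by local dataset sizes."""
    total = sum(sizes)
    return sum(n / total * w for w, n in zip(models, sizes))

def hierarchical_round(w_global, edges, edge_rounds=2):
    """One cloud round: each edge server runs several client-edge rounds,
    then the cloud server averages the resulting edge models."""
    edge_models, edge_sizes = [], []
    for clients in edges:                        # clients: list of (X, y) under one edge
        w_edge = w_global.copy()
        for _ in range(edge_rounds):             # edge-level (local) aggregation rounds
            local_models = [local_update(w_edge, X, y) for X, y in clients]
            sizes = [len(y) for _, y in clients]
            w_edge = weighted_average(local_models, sizes)
        edge_models.append(w_edge)
        edge_sizes.append(sum(len(y) for _, y in clients))
    return weighted_average(edge_models, edge_sizes)   # cloud-level (global) aggregation

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = np.array([1.0, -2.0, 0.5])

    def make_client(n):                          # synthetic local dataset
        X = rng.normal(size=(n, 3))
        return X, X @ w_true + 0.01 * rng.normal(size=n)

    edges = [[make_client(50) for _ in range(3)] for _ in range(2)]  # 2 edges x 3 clients
    w = np.zeros(3)
    for _ in range(20):                          # cloud (global) rounds
        w = hierarchical_round(w, edges)
    print("estimated w:", np.round(w, 3))        # approaches w_true
```

Because clients synchronize with their edge server for several rounds before any cloud-level exchange, most model traffic stays within the edge network; this is the structural property that the transmission optimization methods surveyed in this paper build on.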

Key words: Hierarchical federated learning, Transmission optimization, Communication overhead, Node selection, Model compression

CLC Number: TP393