Computer Science ›› 2022, Vol. 49 ›› Issue (12): 40-45.doi: 10.11896/jsjkx.220600237

• Federated Learning •

Efficient Federated Learning Scheme Based on Background Optimization

GUO Gui-juan1, TIAN Hui1, WANG Tian2,3, JIA Wei-jia2,3   

  1 College of Computer Science and Technology,Huaqiao University,Xiamen,Fujian 361021,China
    2 Institute of Artificial Intelligence and Future Networks,Beijing Normal University,Zhuhai,Guangdong 519000,China
    3 Guangdong Key Lab of AI and Multi-Modal Data Processing,BNU-HKBU United International College,Zhuhai,Guangdong 519000,China
  • Received:2022-06-24 Revised:2022-08-16 Published:2022-12-14
  • About author:GUO Gui-juan,born in 1994,postgraduate.Her main research interests include edge intelligence and federated learning. WANG Tian,born in 1982,Ph.D,professor,Ph.D supervisor,is a senior member of China Computer Federation.His main research interests include edge computing and Internet of Things.
  • Supported by:
    National Key R & D Plan of the Ministry of Science and Technology(2022YFE0201400),National Natural Science Foundation of China(62172046),Outstanding Youth Program of Fujian Natural Science Foundation(2020J06023),Key Special Projects of Guangdong Provincial Department of Education(2021ZDZX1063),Zhuhai Industry University Research Project(ZH22017001210133PWC),Artificial Intelligence and Multimodal Key Laboratory Project of Guangdong Provincial Department of Education(2020KSYS007) and UIC Scientific Research Startup Fund(R72021202).

Abstract: Federated learning effectively preserves data privacy and security because each client trains on its data locally, and research on federated learning has made great progress. However, because client data are non-independent and identically distributed (non-IID) and unbalanced in both quantity and type, training on local data alone inevitably leads to insufficient accuracy and low training efficiency. To address the reduction in federated learning efficiency caused by differences in the clients' training backgrounds, this paper proposes an efficient federated learning scheme based on background optimization, which improves the accuracy of local models on terminal devices and thereby reduces communication cost and improves the training efficiency of the global model. Specifically, first devices and second devices are selected according to the difference in accuracy across environments, and the irrelevance between a first device's model and the global model (hereafter referred to as the difference value) is taken as the standard difference value. A second device then decides whether to upload its local model by comparing its own difference value with that of the first device. Experimental results show that, compared with traditional federated learning, the proposed scheme outperforms the federated averaging algorithm in common federated learning scenarios, improving accuracy by about 7.5% on the MNIST dataset and by about 10% on the CIFAR-10 dataset.
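The selective-upload rule described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the irrelevance (difference value) is measured as 1 minus the cosine similarity between the flattened local and global model weights, and the names irrelevance, should_upload and standard_diff are hypothetical.

import numpy as np

def flatten(weights):
    # Concatenate a list of per-layer weight arrays into one vector.
    return np.concatenate([w.ravel() for w in weights])

def irrelevance(local_weights, global_weights):
    # Difference value between a local model and the global model,
    # defined here (assumption) as 1 - cosine similarity.
    a, b = flatten(local_weights), flatten(global_weights)
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def should_upload(second_device_weights, global_weights, standard_diff):
    # A second device uploads only if its difference value does not
    # exceed the standard difference value set by the first device.
    return irrelevance(second_device_weights, global_weights) <= standard_diff

# One communication round (sketch):
# standard_diff = irrelevance(first_device_weights, global_weights)
# uploads = [w for w in second_device_weight_lists
#            if should_upload(w, global_weights, standard_diff)]
# new_global = [np.mean(layers, axis=0) for layers in zip(*uploads)]  # FedAvg-style averaging

Under this reading, devices whose local models diverge from the global model more than the first device's model does simply skip the upload for that round, which is how the scheme saves communication without discarding well-aligned updates.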

Key words: Federated learning, Background optimization, Equipment classification, Irrelevance, Difference value

CLC Number: TP393.0