Computer Science, 2025, Vol. 52, Issue (3): 152-160. doi: 10.11896/jsjkx.240600014

• Database & Big Data & Data Science •

Federated Learning Evolutionary Multi-objective Optimization Algorithm Based on Improved NSGA-III

HU Kangqi, MA Wubin, DAI Chaofan, WU Yahui, ZHOU Haohao   

  1. National Key Laboratory of Information Systems Engineering, National University of Defense Technology, Changsha 410073, China
  • Received: 2024-06-03 Revised: 2024-08-27 Online: 2025-03-15 Published: 2025-03-07
  • About author: HU Kangqi, born in 2000, postgraduate. His main research interests include federated learning and multi-objective optimization.
    MA Wubin, born in 1986, Ph.D, associate researcher. His main research interests include multi-objective optimization, microservices and data mining.
  • Supported by:
    National Natural Science Foundation of China (61871388).

Abstract: Federated learning is a distributed machine learning paradigm that trains models without sharing raw data. Existing federated learning methods struggle to optimize multiple objectives simultaneously, such as improving model accuracy, reducing communication cost, and balancing performance across participants. To address this problem, a four-objective optimization model for federated learning and an algorithm for solving it are proposed. The model takes data usage cost, global model error rate, the variance of the model accuracy distribution, and communication cost as its optimization objectives. Because the solution space of this model is large and the traditional NSGA-III algorithm has difficulty finding optimal solutions in it, an improved NSGA-III algorithm based on a good point set initialization strategy, GPNSGA-III, is proposed to search for Pareto optimal solutions. The good point set strategy distributes the limited initial population uniformly over the objective solution space, so that the first generation is as close as possible to the optimum and the search ability is improved over the original algorithm. Experimental results show that the hypervolume of the Pareto set obtained by GPNSGA-III is 107% higher on average than that of NSGA-III, and its Spacing value is 32.3% lower on average. Compared with other multi-objective optimization algorithms, GPNSGA-III achieves higher model accuracy while balancing the variance of the model accuracy distribution, communication cost, and data cost.
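The key idea of GPNSGA-III described above is replacing NSGA-III's pseudo-random population initialization with a good point set, so that a small first generation already covers the solution space evenly. The sketch below illustrates one common good-point-set construction (the cyclotomic form r_k = {2 cos(2πk/q)} with q a prime, following Hua and Wang's number-theoretic method); the function names, the particular choice of q, and the mapping onto decision-variable bounds are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np


def _smallest_prime_at_least(n):
    """Return the smallest prime p with p >= n (simple trial division)."""
    candidate = max(int(n), 2)
    while True:
        if all(candidate % d != 0 for d in range(2, int(candidate ** 0.5) + 1)):
            return candidate
        candidate += 1


def good_point_set(pop_size, dim, lower, upper):
    """Initialize pop_size individuals in [lower, upper]^dim from a good point set
    instead of uniform random sampling (lower discrepancy for small populations)."""
    # Cyclotomic construction of the "good point" r:
    # r_k = frac(2*cos(2*pi*k/q)), with q a prime no smaller than 2*dim + 3 (assumed choice).
    q = _smallest_prime_at_least(2 * dim + 3)
    k = np.arange(1, dim + 1)
    r = np.mod(2.0 * np.cos(2.0 * np.pi * k / q), 1.0)
    # i-th point of the set: ({i*r_1}, ..., {i*r_dim}), for i = 1..pop_size
    i = np.arange(1, pop_size + 1).reshape(-1, 1)
    unit_points = np.mod(i * r, 1.0)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    # Map the unit-hypercube points onto the decision-variable bounds
    return lower + unit_points * (upper - lower)


if __name__ == "__main__":
    # Example: 92 individuals (a typical NSGA-III population size for 4 objectives)
    # over 10 decision variables bounded in [0, 1]
    pop = good_point_set(92, 10, lower=np.zeros(10), upper=np.ones(10))
    print(pop.shape)  # (92, 10)
```

Compared with pseudo-random initialization, the sequence {i·r} has lower discrepancy, which is what allows a limited population to approximate a uniform covering of the four-objective model's large solution space before the NSGA-III selection and reference-point mechanisms take over.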

Key words: Federated learning, Multi-objective optimization, Good point set, NSGA-III algorithm, Parameter optimization

CLC Number: TP301