计算机科学 (Computer Science), 2025, Vol. 52, Issue (3): 152-160. doi: 10.11896/jsjkx.240600014

• Database & Big Data & Data Science •

  • Corresponding author: MA Wubin (wb_ma@nudt.edu.cn)
  • First author: HU Kangqi (hukangqi66@nudt.edu.cn)

Federated Learning Evolutionary Multi-objective Optimization Algorithm Based on Improved NSGA-III

HU Kangqi, MA Wubin, DAI Chaofan, WU Yahui, ZHOU Haohao   

  1. National Key Laboratory of Information Systems Engineering,National University of Defense Technology,Changsha 410073,China
  • Received:2024-06-03 Revised:2024-08-27 Online:2025-03-15 Published:2025-03-07
  • About authors: HU Kangqi, born in 2000, postgraduate. His main research interests include federated learning and multi-objective optimization.
    MA Wubin, born in 1986, Ph.D, associate researcher. His main research interests include multi-objective optimization, microservices and data mining.
  • Supported by:
    National Natural Science Foundation of China(61871388).


Abstract: Federated learning is a novel distributed machine learning method that can train models without sharing raw data. Current federated learning methods struggle to optimize multiple objectives simultaneously, such as model accuracy, communication cost, and the balance of performance across participants. To address this problem, a four-objective optimization model for federated learning and a solution algorithm are proposed. Global model error rate, model accuracy distribution variance, communication cost and data usage cost are taken as the optimization objectives to construct the model. Because the solution space of this model is large and the traditional NSGA-III algorithm has difficulty finding optimal solutions in it, an improved NSGA-III federated learning multi-objective optimization algorithm, GPNSGA-III (Good Point Set Initialization NSGA-III), is proposed to find the Pareto optimal solutions. Through the good point set initialization strategy, the algorithm distributes the limited initial population uniformly over the objective solution space, so that, compared with the original algorithm, the first-generation solutions are as close as possible to the optimal values and the search ability is improved. Experimental results show that the hypervolume of the Pareto solutions obtained by GPNSGA-III is improved by 107% on average compared with NSGA-III, and the Spacing value is reduced by 32.3% on average; compared with other multi-objective optimization algorithms, GPNSGA-III more effectively balances model accuracy distribution variance, communication cost and data cost while guaranteeing model accuracy.
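The good point set initialization described in the abstract can be sketched as follows. This is a generic illustration of the classical good point set construction attributed to Hua Luogeng and Wang Yuan (using the prime-based point r_k = {2 cos(2πk/p)}), not the paper's exact implementation; function and parameter names are hypothetical.

```python
import math

def good_point_set(n_points, dim, lower, upper):
    """Generate an initial population from a good point set (sketch)."""
    def is_prime(m):
        return m >= 2 and all(m % d for d in range(2, int(m ** 0.5) + 1))
    # smallest prime p satisfying (p - 3) / 2 >= dim
    p = 2 * dim + 3
    while not is_prime(p):
        p += 1
    # good point components r_k = 2 cos(2*pi*k / p), k = 1..dim
    r = [2 * math.cos(2 * math.pi * k / p) for k in range(1, dim + 1)]
    pop = []
    for i in range(1, n_points + 1):
        # fractional part {i * r_k} maps each coordinate into [0, 1)
        unit = [(i * rk) % 1.0 for rk in r]
        # scale from the unit cube to the search bounds
        pop.append([lo + u * (hi - lo)
                    for u, lo, hi in zip(unit, lower, upper)])
    return pop
```

Unlike pseudo-random initialization, the resulting points are spread with low discrepancy over the search space, which is what gives the first generation its head start.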

Key words: Federated learning, Multi-objective optimization, Good point set, NSGA-III algorithm, Parameter optimization
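The Spacing value reported in the experiments measures how evenly solutions are distributed along the obtained Pareto front. Below is a minimal sketch of the standard Spacing indicator (Schott's metric, using nearest-neighbor Manhattan distances); it illustrates the metric in general, not the paper's evaluation code.

```python
import math

def spacing(front):
    """Schott's Spacing metric over a list of objective vectors (sketch)."""
    n = len(front)
    d = []
    for i, a in enumerate(front):
        # Manhattan distance to the nearest other solution on the front
        d.append(min(sum(abs(x - y) for x, y in zip(a, b))
                     for j, b in enumerate(front) if j != i))
    mean = sum(d) / n
    # standard deviation of nearest-neighbor distances; 0 = perfectly even
    return math.sqrt(sum((di - mean) ** 2 for di in d) / (n - 1))
```

A perfectly uniform front yields a Spacing of 0, so a 32.3% average reduction relative to NSGA-III indicates a more evenly spread solution set.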

CLC number: TP301