Computer Science ›› 2025, Vol. 52 ›› Issue (3): 377-384. doi: 10.11896/jsjkx.240300035

• Information Security •

  • Corresponding author: LIN Weiwei (linww@scut.edu.cn)
  • About author: (854449273@qq.com)

Survey of Federated Incremental Learning

XIE Jiachen1, LIU Bo1, LIN Weiwei2, ZHENG Jianwen1

  1. School of Computer Science, South China Normal University, Guangzhou 510631, China
  2. School of Computer Science and Engineering, South China University of Technology, Guangzhou 510640, China
  • Received:2024-03-05 Revised:2024-09-03 Online:2025-03-15 Published:2025-03-07
  • About author: XIE Jiachen, born in 1999, postgraduate. His main research interest is federated learning.
    LIN Weiwei, born in 1980, Ph.D., professor, is a member of CCF (No.37313D). His main research interests include cloud computing, big data technology and AI application technology.
  • Supported by:
    National Natural Science Foundation of China (620721878) and Guangzhou Development Zone Science and Technology Project (2021GH10).



Abstract: Federated learning, with its unique distributed training mode and secure aggregation mechanism, has become a research hotspot in recent years. However, in real-world scenarios, local model training continually encounters new data, which can cause catastrophic forgetting of old data. Effectively integrating federated learning with incremental learning is therefore key to the sustainable development of federated ecosystems. This paper first surveys and analyzes the concepts underlying federated incremental learning. It then elaborates on federated incremental learning methods based on data, on models, on architectures, and on multi-aspect joint optimization, while categorizing and comparing existing methods. Finally, it analyzes and summarizes future research directions, such as federated incremental learning in large-scale, few-shot, secure and reliable, and multi-task scenarios.
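The data-based family of methods mentioned above is typically built on rehearsal (replay): each client keeps a small buffer of old samples and mixes them into training on new data, so the federated model does not forget earlier tasks. The following is a minimal toy sketch of that idea, not any specific method from this survey: the `Client` class, `fed_avg`, `REHEARSAL_SIZE`, and the linear regression model are all hypothetical illustrations, assuming only NumPy.

```python
# Toy sketch: FedAvg-style training where each client rehearses a small
# buffer of old samples alongside new data, mitigating catastrophic
# forgetting. All names here are illustrative, not from the survey.
import numpy as np

REHEARSAL_SIZE = 20  # how many old samples each client keeps (assumption)

class Client:
    def __init__(self, dim, rng):
        self.w = np.zeros(dim)
        self.buffer_x = np.empty((0, dim))  # rehearsal buffer (features)
        self.buffer_y = np.empty(0)         # rehearsal buffer (targets)
        self.rng = rng

    def local_train(self, x, y, lr=0.1, epochs=10):
        # Mix incoming data with rehearsed old samples, so gradients also
        # reflect previously seen tasks.
        xs = np.vstack([x, self.buffer_x])
        ys = np.concatenate([y, self.buffer_y])
        for _ in range(epochs):
            grad = xs.T @ (xs @ self.w - ys) / len(ys)  # linear least squares
            self.w -= lr * grad
        # Refresh the buffer with a random subset of the new samples.
        keep = self.rng.choice(len(x), size=min(REHEARSAL_SIZE, len(x)),
                               replace=False)
        self.buffer_x = np.vstack([self.buffer_x, x[keep]])[-REHEARSAL_SIZE:]
        self.buffer_y = np.concatenate([self.buffer_y, y[keep]])[-REHEARSAL_SIZE:]
        return self.w

def fed_avg(weights):
    # Server step: plain unweighted average of client models.
    return np.mean(weights, axis=0)

rng = np.random.default_rng(0)
dim = 3
clients = [Client(dim, np.random.default_rng(i)) for i in range(4)]
true_w = np.array([1.0, -2.0, 0.5])
global_w = np.zeros(dim)
for task in range(3):  # each round, clients receive a fresh batch of data
    updates = []
    for c in clients:
        c.w = global_w.copy()           # start from the broadcast global model
        x = rng.normal(size=(40, dim))
        y = x @ true_w
        updates.append(c.local_train(x, y))
    global_w = fed_avg(updates)
print(np.round(global_w, 3))
```

In a real federated incremental learning system the model would be a neural network and the buffer policy would matter (reservoir sampling, generative replay, etc.); the sketch only shows where rehearsal plugs into the client/server loop.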

Key words: Federated learning, Secure aggregation, Catastrophic forgetting, Sustainable development, Federated incremental learning
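The secure aggregation mechanism referenced in the abstract and keywords can be sketched, very loosely, via pairwise additive masking: each pair of clients shares a random mask that one adds and the other subtracts, so the server learns only the sum of updates, never an individual update. This is a toy illustration only; real protocols (e.g. those deployed in production federated learning) additionally handle key agreement, client dropouts, and finite-field arithmetic, and the function name `masked_updates` is hypothetical.

```python
# Toy sketch of pairwise-masked secure aggregation: masks cancel in the
# sum, so the aggregate is exact while individual updates stay hidden.
import numpy as np

def masked_updates(updates, seed=42):
    rng = np.random.default_rng(seed)
    n = len(updates)
    masked = [u.astype(float).copy() for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            pair_mask = rng.normal(size=updates[0].shape)
            masked[i] += pair_mask   # client i adds the shared mask
            masked[j] -= pair_mask   # client j subtracts the same mask
    return masked

updates = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
masked = masked_updates(updates)
aggregate = np.sum(masked, axis=0)   # masks cancel: equals the true sum
print(aggregate)
```

Each `masked[i]` looks like noise on its own, yet summing all of them recovers exactly `sum(updates)`, which is all FedAvg-style servers need.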

CLC Number: TP181