Computer Science ›› 2024, Vol. 51 ›› Issue (3): 271-279.doi: 10.11896/jsjkx.230100125

• Artificial Intelligence •

Decentralized Federated Continual Learning Method Combined with Meta-learning

HUANG Nan, LI Dongdong, YAO Jia, WANG Zhe   

  1. School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
  • Received: 2023-01-30  Revised: 2023-06-18  Online: 2024-03-15  Published: 2024-03-13
  • About author: HUANG Nan, born in 1997, postgraduate. Her main research interests include federated learning and biometric recognition. LI Dongdong, born in 1981, Ph.D, associate professor, is a member of CCF (No.15173M). Her main research interests include speech processing and emotion computing.
  • Supported by: Science and Technology Program of Shanghai (21511100800) and National Natural Science Foundation of China (62276098).

Abstract: To address the problems of continual learning and data security in federated continual scenarios, a decentralized federated continual learning framework combined with meta-learning is constructed. First, to overcome catastrophic forgetting in incremental scenarios, an incremental meta-learning method based on nearest mean-of-exemplars replay, called NMR-cMAML, is proposed. Then, to address privacy and security in federated continual scenarios, a decentralized federated continual framework based on a peer-to-peer network architecture is designed, in contrast to the centralized server-client architecture. Each client in the decentralized framework adopts NMR-cMAML to learn successive tasks incrementally, and effective knowledge transfer between clients is achieved by sharing the meta-learned model during federated communication. Finally, experiments on image datasets (CIFAR-100 and ImageNet-50) verify that the proposed method improves both the data privacy security of the system and the local performance of each client.
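The paper's NMR-cMAML implementation is not given on this page; as a rough illustration of the nearest mean-of-exemplars classification rule the method builds on (function and variable names are hypothetical, and the feature extractor is abstracted away), a minimal sketch might look like:

```python
import numpy as np

def nearest_mean_of_exemplars(exemplars, query):
    """Classify a query feature vector by its nearest class-mean.

    exemplars: dict mapping class label -> (n_i, d) array of stored
               exemplar features for that class
    query:     (d,) feature vector of the sample to classify
    Returns the label whose normalized exemplar mean is closest to the
    normalized query (iCaRL-style nearest-mean-of-exemplars rule).
    """
    q = query / (np.linalg.norm(query) + 1e-12)
    best_label, best_dist = None, np.inf
    for label, feats in exemplars.items():
        mu = feats.mean(axis=0)                # mean of stored exemplars
        mu = mu / (np.linalg.norm(mu) + 1e-12)  # L2-normalize the prototype
        dist = np.linalg.norm(q - mu)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

In a replay-based continual learner, the exemplar sets would be small per-class memories retained across tasks, so old classes remain classifiable after new tasks are learned; the meta-learning (MAML-style) component of NMR-cMAML would sit in how the feature extractor is adapted, which this sketch does not attempt to reproduce.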

Key words: Decentralized federated learning, Data security, Continual learning, Meta-learning

CLC Number: R318