Computer Science, 2024, Vol. 51, Issue (3): 39-47. doi: 10.11896/jsjkx.230700186
刘炜1,2,3, 刘宇昭1,3, 唐琮轲1,3, 王媛媛5, 佘维1,3,4, 田钊1,3
LIU Wei1,2,3, LIU Yuzhao1,3, TANG Congke1,3, WANG Yuanyuan5, SHE Wei1,3,4, TIAN Zhao1,3
Abstract: Massive amounts of scattered, isolated data form "data islands" in which data cannot interact or connect, so how to share the knowledge contained in data securely and effectively while protecting the privacy of the raw data has become a hot research topic. To address this, a blockchain-based federated distillation data sharing model (BFDS) is proposed. Unlike centralized architectures, BFDS uses a blockchain to join multiple participants into a teacher network that works collaboratively in a distributed fashion. Knowledge in the data is transferred by exchanging distillation outputs, which are used to jointly train a lightweight model. A multi-weight node trust evaluation algorithm is further proposed: smart contracts are invoked to assign weights and generate traceable global soft labels, reducing the negative impact of quality differences among participants. Experimental results show that BFDS enables multiple participants to share data knowledge securely and reliably, to train models collaboratively via distillation, and to lower model deployment cost; the proposed multi-weight node evaluation algorithm effectively mitigates the negative influence of low-quality nodes and improves the quality and security of the global soft labels.
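The core aggregation step described above, combining per-participant distillation outputs into a traceable global soft label using trust weights, can be sketched in plain Python. The function names, the toy class distributions, and the trust weights below are illustrative assumptions, not the paper's implementation; in BFDS the weights would come from the multi-weight trust evaluation executed by a smart contract.

```python
import math

def aggregate_soft_labels(soft_labels, weights):
    """Trust-weighted average of per-participant soft labels.

    soft_labels: one per-class probability vector per teacher participant.
    weights: trust weights for each participant (assumed here; in BFDS
             they would be assigned on-chain by the evaluation algorithm).
    """
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize trust weights to sum to 1
    n_classes = len(soft_labels[0])
    return [sum(w * sl[k] for w, sl in zip(norm, soft_labels))
            for k in range(n_classes)]

def distillation_loss(student_probs, global_soft_label, eps=1e-12):
    """Cross-entropy of the student's predictions against the global soft label."""
    return -sum(t * math.log(s + eps)
                for t, s in zip(global_soft_label, student_probs))

# Three teacher participants; the third is a low-quality outlier.
teacher_outputs = [
    [0.70, 0.20, 0.10],
    [0.60, 0.30, 0.10],
    [0.10, 0.10, 0.80],
]
trust_weights = [0.45, 0.45, 0.10]  # a low weight dampens the outlier's influence
global_label = aggregate_soft_labels(teacher_outputs, trust_weights)
print([round(p, 3) for p in global_label])  # still a valid probability distribution
```

Because the outlier receives a small trust weight, its skewed distribution barely shifts the global soft label, which is the negative-influence reduction the abstract claims for low-quality nodes.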