Computer Science, 2024, Vol. 51, Issue (3): 39-47. doi: 10.11896/jsjkx.230700186

• Information Security Protection in New Computing Mode •

Study on Blockchain Based Federated Distillation Data Sharing Model

LIU Wei1,2,3, LIU Yuzhao1,3, TANG Congke1,3, WANG Yuanyuan5, SHE Wei1,3,4, TIAN Zhao1,3   

  1 School of Cyber Science and Engineering, Zhengzhou University, Zhengzhou 450002, China
    2 Henan Key Laboratory of Network Cryptography Technology (Information Engineering University), Zhengzhou 450000, China
    3 Zhengzhou Key Laboratory of Blockchain and Data Intelligence (Zhengzhou University), Zhengzhou 450000, China
    4 Songshan Laboratory, Zhengzhou 450000, China
    5 State Grid Henan Electric Power Company, Xuchang, Henan 461000, China
  • Received: 2023-07-24  Revised: 2023-11-30  Online: 2024-03-15  Published: 2024-03-13
  • About author: LIU Wei, born in 1981, Ph.D, associate professor, Ph.D supervisor, is a member of CCF (No.49811M). His main research interests include blockchain technology, privacy protection and smart healthcare. TIAN Zhao, born in 1985, Ph.D, associate professor, is a member of CCF (No.K7436M). His main research interests include blockchain technology, information security, and intelligent transport.
  • Supported by:
    Henan Province University Science and Technology Innovation Talent Support Plan (21HASTIT031), Research Project of Henan Provincial Key Laboratory of Network Cryptography Technology (LNCT2022-A04), Key Scientific Research Project of Colleges and Universities in Henan Province (24A520045) and Songshan Laboratory Pre-research Project (YYYY022022003)

Abstract: Because raw data are privacy-sensitive, they are difficult to share directly among multiple participants, and secure data sharing with privacy preservation has become a hot research topic. To address this problem, this paper proposes a blockchain-based federated distillation data sharing model (BFDS). Blockchain is used to organize multiple participants into a collaborative teacher network; by exchanging distilled outputs, the knowledge of complex teacher networks is transferred and used to train lightweight models. A novel multi-weight node trust evaluation algorithm is further proposed, in which smart contracts generate traceable global soft labels, reducing the negative impact of quality differences among participants. Experimental results show that BFDS can reliably share data knowledge among multiple parties, collaboratively distill and train models, and reduce model deployment costs. The proposed algorithm effectively reduces the negative impact of low-quality nodes and improves the quality and security of global soft labels.
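To make the workflow described in the abstract concrete, the following Python sketch illustrates, under stated assumptions rather than as the authors' implementation, how participants' distilled outputs could be fused into trust-weighted global soft labels and how a lightweight student model could be scored against them with a standard distillation loss. All names (trust_scores, temperature, the example dimensions) are illustrative assumptions.

```python
# Minimal sketch (not the BFDS implementation) of two ideas from the abstract:
# (1) trust-weighted aggregation of participants' distilled outputs into
#     global soft labels, and
# (2) a standard distillation loss a lightweight student could be trained with.
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)            # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def aggregate_soft_labels(participant_logits, trust_scores, temperature=3.0):
    """Fuse each participant's distilled outputs into global soft labels,
    weighting by an externally computed trust score per node."""
    w = np.asarray(trust_scores, dtype=float)
    w = w / w.sum()                                   # normalize trust weights
    probs = np.stack([softmax(l, temperature) for l in participant_logits])
    return np.tensordot(w, probs, axes=1)             # weighted average per sample

def distillation_loss(student_logits, global_soft_labels, temperature=3.0):
    """Mean KL divergence between the global soft labels and the student's
    temperature-softened predictions."""
    p = global_soft_labels
    q = softmax(student_logits, temperature)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 3 participants, 5 shared reference samples, 4 classes (toy dimensions)
    logits = [rng.normal(size=(5, 4)) for _ in range(3)]
    trust = [0.9, 0.7, 0.2]                           # e.g. from node trust evaluation
    global_labels = aggregate_soft_labels(logits, trust)
    student = rng.normal(size=(5, 4))
    print("distillation loss:", distillation_loss(student, global_labels))
```

In this reading, low-trust nodes contribute proportionally less to the global soft labels, which is one plausible way the multi-weight trust evaluation could limit the influence of low-quality participants.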

Key words: Blockchain, Knowledge distillation, Data sharing, Smart contracts

CLC Number: TP391