Computer Science ›› 2023, Vol. 50 ›› Issue (11A): 230300012-7. DOI: 10.11896/jsjkx.230300012
ZHANG Yu¹, CAO Xiqing²,³, NIU Saisai²,³, XU Xinlei¹, ZHANG Qian¹, WANG Zhe¹