Computer Science ›› 2021, Vol. 48 ›› Issue (6): 227-233.doi: 10.11896/jsjkx.200800016

• Artificial Intelligence •

Parallel Pruning from Two Aspects for VGG16 Optimization

LI Shan1,2, XU Xin-zheng1,2,3   

  1 Engineering Research Center of Mine Digitalization of Ministry of Education,China University of Mining and Technology,Xuzhou,Jiangsu 221116,China
    2 School of Computer Science and Technology,China University of Mining and Technology,Xuzhou,Jiangsu 221116,China
    3 Key Laboratory of Opto-technology and Intelligent Control,Ministry of Education, Lanzhou Jiaotong University,Lanzhou 730070,China
  • Received:2020-08-02 Revised:2020-11-30 Online:2021-06-15 Published:2021-06-03
  • About author:LI Shan,born in 1995,postgraduate.His main research interests include deep learning and computer vision.(lslishan@163.com)
    XU Xin-zheng,born in 1980,Ph.D,associate professor,is a senior member of China Computer Federation.His main research interests include machine learning,data mining and pattern recognition.
  • Supported by:
    National Natural Science Foundation of China (61976217),Opening Foundation of Key Laboratory of Opto-technology and Intelligent Control (Lanzhou Jiaotong University),Ministry of Education (KFKT2020-03),Fundamental Research Funds for the Central Universities (2019XKQYMS87),Assistance Program for Future Outstanding Talents of China University of Mining and Technology(2020WLJCRCZL058) and Postgraduate Research & Practice Innovation Program of Jiangsu Province(KYCX20_2055).

Abstract: In recent years,much of the pruning work for convolutional neural networks has been based on the norm value of the filter:the smaller the norm,the smaller the impact on the network after the filter is clipped.This idea makes full use of the numerical characteristics of filters,but it ignores their structural characteristics.Based on this observation,this paper applies AHCF (Agglomerative Hierarchical Clustering method for Filters) to VGG16 and proposes a parallel pruning method that prunes filters from both numerical and structural perspectives.The method reduces the redundant filters and parameters in the VGG16 network while improving classification accuracy and preserving the learning curve of the original network.On the CIFAR10 dataset,the accuracy of the proposed method is 0.71% higher than that of the original VGG16 network;on MNIST,its accuracy matches that of the original network.
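The two-aspect idea in the abstract — scoring filters by norm (numerical aspect) while grouping similar filters by agglomerative hierarchical clustering (structural aspect) — can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's actual AHCF implementation: the function name, the Ward linkage choice, the cluster count, and the norm-quantile threshold are all hypothetical stand-ins.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def select_filters_to_prune(weights, n_clusters=4, norm_quantile=0.5):
    """Pick filter indices to prune from one conv layer (hypothetical sketch).

    weights: array of shape (out_channels, in_channels, k, k).
    A filter is pruned only when it is BOTH low-norm (numerical aspect)
    and redundant within its cluster (structural aspect): from each
    agglomerative cluster the largest-norm filter is always kept.
    """
    n_filters = weights.shape[0]
    flat = weights.reshape(n_filters, -1)
    norms = np.abs(flat).sum(axis=1)               # L1 norm of each filter

    # Structural aspect: Ward agglomerative clustering of flattened filters
    Z = linkage(flat, method="ward")
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")

    # Numerical aspect: a filter must fall below this norm to be prunable
    threshold = np.quantile(norms, norm_quantile)

    prune = []
    for c in np.unique(labels):
        members = np.flatnonzero(labels == c)
        keep = members[np.argmax(norms[members])]  # cluster representative
        for idx in members:
            if idx != keep and norms[idx] <= threshold:
                prune.append(int(idx))
    return sorted(prune)

# Example: 16 random 3x3 filters with 3 input channels
rng = np.random.default_rng(0)
w = rng.normal(size=(16, 3, 3, 3))
pruned = select_filters_to_prune(w, n_clusters=4)
print(pruned)
```

Requiring both conditions is what makes the pruning "parallel from two aspects": a small-norm filter survives if it is its cluster's sole representative, and a structurally redundant filter survives if its norm is still large.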

Key words: Clustering algorithm, Convolutional neural network, Lightweight, Pruning, VGG

CLC Number: TP181