Journal of Applied Sciences ›› 2022, Vol. 40 ›› Issue (5): 838-849. DOI: 10.3969/j.issn.0255-8297.2022.05.013
• Computer Science and Applications •
Review of Neural Network Pruning Techniques
JIANG Xiaoyong1,2, LI Zhongyi1, HUANG Langyue1, PENG Mengle1, XU Shuyang1
Received: 2021-09-12
Online: 2022-09-30
Published: 2022-09-30
CLC Number:
JIANG Xiaoyong, LI Zhongyi, HUANG Langyue, PENG Mengle, XU Shuyang. Review of Neural Network Pruning Techniques[J]. Journal of Applied Sciences, 2022, 40(5): 838-849.
[1] LeCun Y, Boser B, Denker J S, et al. Backpropagation applied to handwritten zip code recognition[J]. Neural Computation, 1989, 1(4): 541-551.
[2] Krizhevsky A, Sutskever I, Hinton G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM, 2017, 60(6): 84-90.
[3] Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition[DB/OL]. 2014[2021-09-21]. https://arxiv.org/abs/1409.1556.
[4] Szegedy C, Liu W, Jia Y Q, et al. Going deeper with convolutions[C]//2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015: 1-9.
[5] He K M, Zhang X Y, Ren S Q, et al. Deep residual learning for image recognition[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 770-778.
[6] LeCun Y. Optimal brain damage[C]//Advances in Neural Information Processing Systems, 1990, 2: 598-605.
[7] Denil M, Shakibi B, Dinh L, et al. Predicting parameters in deep learning[DB/OL]. 2014[2021-09-12]. https://arxiv.org/abs/1306.0543.
[8] Hassibi B, Stork D G. Second order derivatives for network pruning: optimal brain surgeon[C]//Advances in Neural Information Processing Systems, 1993: 164-171.
[9] Thimm G, Fiesler E. Evaluating pruning methods[C]//International Symposium on Artificial Neural Networks, 1995, A2: 20-25.
[10] Srinivas S, Babu R V. Data-free parameter pruning for deep neural networks[J]. Computer Science, 2015: 2830-2838.
[11] Han S, Pool J, Tran J, et al. Learning both weights and connections for efficient neural networks[DB/OL]. 2015[2021-09-12]. https://arxiv.org/abs/1506.02626.
[12] Han S, Mao H, Dally W J. Deep compression: compressing deep neural networks with pruning, trained quantization and Huffman coding[DB/OL]. 2016[2021-09-12]. https://arxiv.org/abs/1510.00149.
[13] Han S, Liu X Y, Mao H Z, et al. EIE: efficient inference engine on compressed deep neural network[C]//2016 ACM/IEEE 43rd Annual International Symposium on Computer Architecture (ISCA), 2016: 243-254.
[14] Guo Y W, Yao A B, Chen Y R. Dynamic network surgery for efficient DNNs[C]//Advances in Neural Information Processing Systems, 2016: 1379.
[15] Hu H Y, Peng R, Tai Y W, et al. Network trimming: a data-driven neuron pruning approach towards efficient deep architectures[DB/OL]. 2016[2021-09-12]. https://arxiv.org/abs/1607.03250.
[16] Louizos C, Welling M, Kingma D P. Learning sparse neural networks through L0 regularization[DB/OL]. 2017[2021-09-12]. https://arxiv.org/abs/1712.01312.
[17] Lee N, Ajanthan T, Torr P H S. SNIP: single-shot network pruning based on connection sensitivity[DB/OL]. 2018[2021-09-12]. https://arxiv.org/abs/1810.02340.
[18] Frankle J, Carbin M. The lottery ticket hypothesis: finding sparse, trainable neural networks[DB/OL]. 2018[2021-09-12]. https://arxiv.org/abs/1803.03635.
[19] Wang C, Zhang G, Grosse R. Picking winning tickets before training by preserving gradient flow[DB/OL]. 2020[2021-09-12]. https://arxiv.org/abs/2002.07376.
[20] Anwar S, Hwang K, Sung W. Structured pruning of deep convolutional neural networks[J]. ACM Journal on Emerging Technologies in Computing Systems, 2017, 13(3): 1-18.
[21] Zhou A, Ma Y, Zhu J, et al. Learning N:M fine-grained structured sparse neural networks from scratch[DB/OL]. 2021[2021-09-12]. https://arxiv.org/abs/2102.04010.
[22] Wen W, Wu C, Wang Y, et al. Learning structured sparsity in deep neural networks[DB/OL]. 2016[2021-09-12]. https://arxiv.org/abs/1608.03665.
[23] Gordon A, Eban E, Nachum O, et al. MorphNet: fast & simple resource-constrained structure learning of deep networks[C]//2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2018: 1586-1595.
[24] Lebedev V, Lempitsky V. Fast ConvNets using group-wise brain damage[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 2554-2564.
[25] Li H, Kadav A, Durdanovic I, et al. Pruning filters for efficient ConvNets[DB/OL]. 2017[2021-09-12]. https://arxiv.org/abs/1608.08710.
[26] Luo J H, Wu J X, Lin W Y. ThiNet: a filter level pruning method for deep neural network compression[C]//2017 IEEE International Conference on Computer Vision (ICCV), 2017: 5068-5076.
[27] Molchanov P, Tyree S, Karras T, et al. Pruning convolutional neural networks for resource efficient transfer learning[DB/OL]. 2016[2021-09-12]. https://arxiv.org/abs/1611.06440.
[28] Lin S H, Ji R R, Li Y C, et al. Toward compact ConvNets via structure-sparsity regularized filter pruning[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020, 31(2): 574-588.
[29] Lin S H, Ji R R, Yan C Q, et al. Towards optimal structured CNN pruning via generative adversarial learning[C]//2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019: 2785-2794.
[30] He Y, Liu P, Wang Z W, et al. Filter pruning via geometric median for deep convolutional neural networks acceleration[C]//2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019: 4335-4344.
[31] He Y, Dong X Y, Kang G L, et al. Asymptotic soft filter pruning for deep convolutional neural networks[J]. IEEE Transactions on Cybernetics, 2020, 50(8): 3594-3604.
[32] Lin M B, Ji R R, Wang Y, et al. HRank: filter pruning using high-rank feature map[C]//2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020: 1526-1535.
[33] Zhu J H, Zhao Y, Pei J H. Progressive kernel pruning based on the information mapping sparse index for CNN compression[J]. IEEE Access, 2021, 9: 10974-10987.
[34] Polyak A, Wolf L. Channel-level acceleration of deep face representations[J]. IEEE Access, 2015, 3: 2163-2175.
[35] He Y H, Zhang X Y, Sun J. Channel pruning for accelerating very deep neural networks[C]//2017 IEEE International Conference on Computer Vision (ICCV), 2017: 1398-1406.
[36] Yu R C, Li A, Chen C F, et al. NISP: pruning networks using neuron importance score propagation[C]//2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2018: 9194-9203.
[37] Liu Z, Li J G, Shen Z Q, et al. Learning efficient convolutional networks through network slimming[C]//2017 IEEE International Conference on Computer Vision (ICCV), 2017: 2755-2763.
[38] Huang Z, Wang N. Data-driven sparse structure selection for deep neural networks[C]//European Conference on Computer Vision (ECCV), 2018: 317-334.
[39] Zhuang Z W, Tan M K, Zhuang B H, et al. Discrimination-aware channel pruning for deep neural networks[DB/OL]. 2018[2021-09-12]. https://arxiv.org/abs/1810.11809.
[40] Ye J B, Lu X, Lin Z, et al. Rethinking the smaller-norm-less-informative assumption in channel pruning of convolution layers[DB/OL]. 2018[2021-09-12]. https://arxiv.org/abs/1802.00124.
[41] Ye Y, You G, Fwu J K, et al. Channel pruning via optimal thresholding[J]. Computer Vision and Pattern Recognition, 2020: 508-516.
[42] Liu Z, Sun M J, Zhou T H, et al. Rethinking the value of network pruning[DB/OL]. 2019[2021-09-12]. https://arxiv.org/abs/1810.05270.
[43] Guo S P, Wang Y J, Li Q Q, et al. DMCP: differentiable Markov channel pruning for neural networks[C]//2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020: 1536-1544.
[44] Chang J F, Lu Y, Xue P, et al. Automatic channel pruning via clustering and swarm intelligence optimization for CNN[J]. Applied Intelligence, 2022: 1-21.
[45] Howard A G, Zhu M, Chen B, et al. MobileNets: efficient convolutional neural networks for mobile vision applications[DB/OL]. 2017[2021-09-12]. https://arxiv.org/abs/1704.04861.
[46] Yu J H, Yang L J, Xu N, et al. Slimmable neural networks[DB/OL]. 2019[2021-09-12]. https://arxiv.org/abs/1812.08928.
[47] Yu J H, Huang T. Universally slimmable networks and improved training techniques[C]//2019 IEEE/CVF International Conference on Computer Vision (ICCV), 2019: 1803-1811.
[48] He Y H, Lin J, Liu Z J, et al. AMC: AutoML for model compression and acceleration on mobile devices[C]//European Conference on Computer Vision (ECCV), 2018: 784-800.
[49] Yu J, Huang T. AutoSlim: towards one-shot architecture search for channel numbers[DB/OL]. 2019[2021-09-12]. https://arxiv.org/abs/1903.11728.
[50] Cai H, Gan C, Wang T Z, et al. Once-for-all: train one network and specialize it for efficient deployment[C]//International Conference on Learning Representations (ICLR), 2020.
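Two of the criteria that recur throughout this bibliography are easy to prototype: magnitude-based unstructured pruning in the spirit of Han et al. [11], and L1-norm filter (structured) pruning in the spirit of Li et al. [25]. The sketch below is illustrative only and is not code from the review or the cited papers; it uses PyTorch's built-in torch.nn.utils.prune utilities, and the toy model and sparsity levels (0.8 and 0.5) are arbitrary placeholders.

# Minimal sketch (assumption: PyTorch >= 1.4 with torch.nn.utils.prune).
# Illustrates the criteria of refs [11] and [25]; model/amounts are placeholders.
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy CNN standing in for the networks surveyed; layer sizes are arbitrary.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
)

# Unstructured pruning: zero the 80% of conv weights with the smallest
# magnitude, ranked globally across all conv layers (criterion of [11]).
conv_params = [(m, "weight") for m in model.modules() if isinstance(m, nn.Conv2d)]
prune.global_unstructured(conv_params, pruning_method=prune.L1Unstructured, amount=0.8)

# Structured pruning: mask the 50% of filters in the first conv layer with
# the smallest L1 norm (criterion of [25]; dim=0 indexes output filters).
prune.ln_structured(model[0], name="weight", amount=0.5, n=1, dim=0)

# Fold the masks into the weights and report the resulting sparsity.
for m, _ in conv_params:
    prune.remove(m, "weight")
    sparsity = (m.weight == 0).float().mean().item()
    print(f"{m}: {sparsity:.1%} of weights zeroed")

Note that masking only zeroes weights in place; realizing actual speedups from filter pruning requires physically removing the masked filters (and the matching input channels of the following layer) and then fine-tuning, which is how the surveyed structured-pruning methods proceed.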
Related Articles
[1] LUO Changyin, CHEN Xuebin, SONG Shangwen, ZHANG Shufen, LIU Zhiyu. Federated Ensemble Algorithm Based on Deep Neural Network[J]. Journal of Applied Sciences, 2022, 40(3): 493-510.
[2] QIU Bin, SUN Manman, CUI Suli. Energy-Saving Scheduling Algorithm for Multi-Variable Neighborhood Based on Pruning Optimization[J]. Journal of Applied Sciences, 2022, 40(2): 349-360.
[3] ZHU Li, YANG Qing, WU Tao, LI Chen, LI Ming. Emotional Analysis of Brain Waves Based on CNN and Bi-LSTM[J]. Journal of Applied Sciences, 2022, 40(1): 1-12.
[4] LEI Qianhui, PAN Lili, SHAO Weizhi, HU Haipeng, HUANG Yao. Segmentation Model of COVID-19 Lesions Based on Triple Attention Mechanism[J]. Journal of Applied Sciences, 2022, 40(1): 105-115.
[5] KANG Huixian, YI Biao, WU Hanzhou. Recent Advances in Text Steganography and Steganalysis[J]. Journal of Applied Sciences, 2021, 39(6): 923-938.
[6] OU Qiaofeng, XIAO Jiabing, XIE Qunqun, XIONG Bangshu. Multi-target Detection and Recognition for Vehicle Inspection Images Based on Deep Learning[J]. Journal of Applied Sciences, 2021, 39(6): 939-951.
[7] LI Wenju, HE Maoxian, ZHANG Yaoxing, CHEN Huiling, LI Peigang. Crack Detection of Track Slab Based on Convolutional Neural Network and Voting Mechanism[J]. Journal of Applied Sciences, 2021, 39(4): 627-640.
[8] GUO Yubo, LU Jun, DUAN Pengqi. Automatic Classification of Bamboo Flute Playing Skills Based on Deep Learning[J]. Journal of Applied Sciences, 2021, 39(4): 685-694.
[9] HAO Yan, SHI Huiyu, HUO Shoujun, HAN Dan, CAO Rui. Emotion Classification Based on EEG Deep Learning[J]. Journal of Applied Sciences, 2021, 39(3): 347-346.
[10] DU Chengze, DUAN Youxiang, SUN Qifeng. Seismic Fault Identification Method Based on ResUNet and Dense CRF Model[J]. Journal of Applied Sciences, 2021, 39(3): 367-366.
[11] WANG Wanguo, MU Shiyou, LIU Yue, LIU Guangxiu, LANG Fenling. Research on Insulator Self Exploding Detection in UAV Inspection Based on Deep Learning[J]. Journal of Applied Sciences, 2021, 39(2): 222-231.
[12] LIU Zhiyu, ZHANG Shufen, LIU Yang, LUO Changyin, LI Min. Data Augmentation Method Based on Image Gradient[J]. Journal of Applied Sciences, 2021, 39(2): 302-311.
[13] HU Enxiang, WANG Chunyu, PAN Meiqin. Clustering by Pruning Paths Based on Shortest Paths from Density Peaks[J]. Journal of Applied Sciences, 2020, 38(5): 792-802.
[14] WANG Canjun, LIAO Xin, CHEN Jiaxin, QIN Zheng, LIU Xuchong. Research on Facial Modification Detection Algorithm Based on Convolutional Neural Network[J]. Journal of Applied Sciences, 2019, 37(5): 618-630.
[15] ZHU Yiming, CHEN Fan, HE Hongjie, CHEN Hongyou. Orthogonal GAN Information Hiding Model Based on Secret Information Driven[J]. Journal of Applied Sciences, 2019, 37(5): 721-732.