Dynamic connection pruning for densely connected convolutional neural networks


Abstract

Densely connected convolutional neural networks dominate a variety of downstream tasks thanks to their extraordinary performance. However, such networks typically require excessive computing resources, which hinders their deployment on mobile devices. In this paper, we propose a dynamic connection pruning algorithm, a cost-effective method for eliminating a large amount of the redundancy in densely connected networks. First, we propose a Sample-Evaluation process to assess the contributions of connections: in each epoch, sub-networks are sampled from the unpruned network, the parameters of the unpruned network are updated, and the contributions of the connections are evaluated based on the performance of the sub-networks. Connections with low contributions are pruned first. Then, we search for the distribution of pruning ratios via a Markov process. Finally, we prune the network based on the connection contributions and pruning ratios learned in the two preceding stages, obtaining a lightweight network. The effectiveness of our method is verified on both high-level and low-level tasks. On the CIFAR-10 dataset, top-1 accuracy barely drops (-0.03%) when FLOPs are reduced by 46.8%. On the super-resolution task, our model markedly outperforms other lightweight networks in both visual and quantitative experiments. These results verify the effectiveness and generality of the proposed method.
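To make the Sample-Evaluation idea from the abstract concrete, the toy sketch below samples random sub-networks of a densely connected block, credits each surviving connection with the sub-network's score, and prunes the lowest-contribution connections up to a target ratio. This is an illustration only, not the authors' implementation: the variable names, the 0.5 sampling probability, the running-mean scoring, and the fixed global pruning ratio are all assumptions, and the paper instead learns the ratio distribution with a Markov process and scores sub-networks on real validation data.

```python
import random

random.seed(0)

# Dense connectivity of a 6-layer block: layer j receives every earlier
# output i < j, so each pair (i, j) is a prunable connection.
connections = [(i, j) for j in range(1, 6) for i in range(j)]
contribution = {c: 0.0 for c in connections}
counts = {c: 0 for c in connections}

def evaluate(subnet):
    """Stand-in for a sub-network's validation score (accuracy or PSNR).

    A real implementation would update the shared weights and then measure
    performance on held-out data; a noisy proxy keeps the sketch runnable.
    """
    return random.random() + 0.01 * len(subnet)

for epoch in range(200):
    # Sample a sub-network: keep each connection with probability 0.5
    # (an assumed sampling scheme, not necessarily the paper's).
    subnet = [c for c in connections if random.random() < 0.5]
    score = evaluate(subnet)
    # Credit the sampled connections with the sub-network's performance;
    # a running mean serves as the contribution estimate.
    for c in subnet:
        counts[c] += 1
        contribution[c] += (score - contribution[c]) / counts[c]

# Prune the lowest-contribution connections first, up to a target ratio.
# In the paper the ratio distribution is searched via a Markov process;
# a fixed global ratio stands in for it here.
ratio = 0.5
ranked = sorted(connections, key=lambda c: contribution[c])
pruned = set(ranked[: int(ratio * len(connections))])
kept = [c for c in connections if c not in pruned]
print(f"kept {len(kept)}/{len(connections)} connections")
```

Because every sampled sub-network shares the unpruned network's weights, the cost of estimating connection contributions is amortized over ordinary training epochs rather than requiring separate training runs per candidate.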



Acknowledgments

This work is supported by the National Natural Science Foundation of China (U21B2004), the Zhejiang Provincial Key R&D Program of China (2021C01119), and the Core Technology Research Project of Foshan, Guangdong Province, China (1920001000498).

Author information

Corresponding author

Correspondence to Haoji Hu.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Hu, X., Fang, H., Zhang, L. et al. Dynamic connection pruning for densely connected convolutional neural networks. Appl Intell 53, 19505–19521 (2023). https://doi.org/10.1007/s10489-023-04513-8

