Network pruning via resource reallocation

Yuenan Hou*, Zheng Ma, Chunxiao Liu, Zhe Wang, Chen Change Loy

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

19 Citations (Scopus)

Abstract

Channel pruning is broadly recognized as an effective approach to obtaining a compact model by eliminating unimportant channels from a large, cumbersome network. Contemporary methods typically perform an iterative pruning procedure starting from the original over-parameterized model, which is both tedious and expensive, especially when the pruning is aggressive. In this paper, we propose a simple yet effective channel pruning technique, termed network Pruning via rEsource rEalLocation (PEEL), to quickly produce a desired slim model at negligible cost. Specifically, PEEL first constructs a predefined backbone and then conducts resource reallocation on it, shifting parameters from less informative layers to more important layers in a single round, thus amplifying the positive effect of these informative layers. To demonstrate the effectiveness of PEEL, we perform extensive experiments on ImageNet with ResNet-18, ResNet-50, MobileNetV2, MobileNetV3-small and EfficientNet-B0. Experimental results show that structures uncovered by PEEL exhibit competitive performance with state-of-the-art pruning algorithms under various pruning settings. Encouraging results are also observed when applying PEEL to compress semantic segmentation models. Our code is available at https://github.com/cardwing/Codes-for-PEEL.
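The core idea of single-round resource reallocation can be illustrated with a minimal sketch. This is not the authors' implementation (see the repository linked above for that); it simply redistributes a fixed per-layer channel budget in proportion to hypothetical layer importance scores, moving capacity from less informative layers to more informative ones in one pass. The function name and the importance scores are illustrative assumptions.

```python
# Hypothetical sketch of one-round resource reallocation (not the PEEL code).
# Given per-layer importance scores, shift channels from less informative
# layers to more informative ones while keeping the total budget fixed.

def reallocate_channels(channels, importance, min_channels=1):
    """Redistribute a fixed total channel budget across layers in
    proportion to each layer's importance score, in a single round."""
    assert len(channels) == len(importance)
    budget = sum(channels)                 # total channels stays constant
    total_importance = sum(importance)
    # Proportional allocation, floored at min_channels per layer.
    new_channels = [max(min_channels, round(budget * s / total_importance))
                    for s in importance]
    # Repair rounding drift so the budget is exactly preserved,
    # adjusting the most (or least) important layers first.
    drift = sum(new_channels) - budget
    order = sorted(range(len(new_channels)),
                   key=lambda i: importance[i],
                   reverse=(drift < 0))
    i = 0
    while drift != 0:
        idx = order[i % len(order)]
        step = -1 if drift > 0 else 1
        if new_channels[idx] + step >= min_channels:
            new_channels[idx] += step
            drift += step
        i += 1
    return new_channels
```

For example, three layers of 64 channels each with importance scores 1:2:3 would be reshaped to roughly 32, 64 and 96 channels, keeping the 192-channel budget while concentrating capacity in the most informative layer.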

Original language: English
Article number: 109886
Journal: Pattern Recognition
Volume: 145
DOIs
Publication status: Published - Jan 2024
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2023 Elsevier Ltd

ASJC Scopus Subject Areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

Keywords

  • Network pruning
  • Resource reallocation
  • Searching cost
