Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression

By Yawei Li, Shuhang Gu, Christoph Mayer, Luc Van Gool and Radu Timofte

Abstract

In this paper, we analyze two popular network compression techniques, i.e. filter pruning and low-rank decomposition, in a unified sense. By simply changing the way the sparsity regularization is enforced, filter pruning and low-rank decomposition can be derived accordingly. This provides another flexible choice for network compression because the techniques complement each other. For example, in popular network architectures with shortcut connections (e.g. ResNet), filter pruning cannot deal with the last convolutional layer in a ResBlock while the low-rank decomposition methods can. In addition, we propose to compress the whole network jointly instead of in a layer-wise manner. Our approach proves its potential as it compares favorably to the state-of-the-art on several benchmarks.

Comment: Accepted by CVPR 2020. Code is available at https://github.com/ofsoundof/group_sparsit
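The abstract's central observation, that filter pruning and low-rank decomposition arise from the same group-sparsity regularizer applied to different groups, can be sketched numerically. The snippet below is a minimal illustration (not the authors' code) using a hypothetical NumPy weight matrix: penalizing the rows of a reshaped convolution weight encourages whole filters to vanish (pruning), while penalizing the columns of one factor of a factorization W ≈ A·B encourages the inner dimension, i.e. the rank, to shrink (decomposition). All names and shapes here are illustrative assumptions.

```python
import numpy as np

# Hypothetical 2D view of a conv layer's weights:
# rows = output filters, columns = flattened (in_ch * k * k) inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 27))

def group_lasso(M, axis):
    """Sum of l2 norms over the groups (rows or columns) of M."""
    return np.linalg.norm(M, ord=2, axis=axis).sum()

# Filter-pruning view: each output filter (row) is one group;
# rows driven to zero by the penalty can be removed outright.
prune_penalty = group_lasso(W, axis=1)

# Decomposition view: factor W = A @ B (here via SVD) and penalize
# the columns of A as groups; zeroed columns delete a component of
# the factorization, reducing its rank.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U * s   # (8, 8): left factor, columns scaled by singular values
B = Vt      # (8, 27): right factor
decomp_penalty = group_lasso(A, axis=0)

# Same regularizer, two different group definitions.
print(prune_penalty, decomp_penalty)
```

In both cases the penalty is a sum of l2 norms of groups; only the choice of grouping differs, which is the "hinge" the title refers to.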

Topics: Computer Science - Computer Vision and Pattern Recognition
Year: 2020
OAI identifier: oai:arXiv.org:2003.08935
