Abstract
In this paper, we introduce a new channel pruning method to accelerate very deep convolutional neural networks. Given a trained CNN model, we propose an iterative two-step algorithm to effectively prune each layer, using LASSO regression based channel selection and least squares reconstruction. We further generalize this algorithm to multi-layer and multi-branch cases. Our method reduces the accumulated error and enhances compatibility with various architectures. Our pruned VGG-16 achieves state-of-the-art results with a 5× speed-up and only a 0.3% increase in error. More importantly, our method can also accelerate modern networks such as ResNet and Xception, suffering only 1.4% and 1.0% accuracy loss respectively under a 2× speed-up, which is significant.
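The two-step procedure described above can be illustrated with a minimal sketch: given sampled input patches and outputs of one convolutional layer, a LASSO regression over per-channel coefficients selects which input channels to keep, and a least-squares fit then reconstructs the remaining weights so the pruned layer's output approximates the original. The code below is an illustrative approximation of that idea, not the authors' implementation; the array shapes, the `prune_channels` name, and the use of scikit-learn's `Lasso` are assumptions made for this example.

```python
import numpy as np
from sklearn.linear_model import Lasso

def prune_channels(X, W, Y, alpha=1e-4):
    """Sketch of two-step channel pruning (assumed shapes, not the paper's code).

    X: sampled input patches, shape (n, c_in, k*k)
    W: original conv weights,  shape (c_out, c_in, k*k)
    Y: original layer outputs for the samples, shape (n, c_out)
    """
    n, c_in, kk = X.shape
    c_out = W.shape[0]

    # Contribution of each input channel i to the output:
    # Z[s, i, o] = sum_k X[s, i, k] * W[o, i, k]
    Z = np.einsum('nik,oik->nio', X, W)                  # (n, c_in, c_out)

    # Step 1: LASSO over channel coefficients beta so that
    # sum_i beta_i * Z[:, i, :] approximates Y; zeros mark pruned channels.
    design = Z.transpose(1, 0, 2).reshape(c_in, -1).T    # (n*c_out, c_in)
    beta = Lasso(alpha=alpha, fit_intercept=False).fit(design, Y.ravel()).coef_
    keep = np.nonzero(beta)[0]

    # Step 2: least-squares reconstruction of the weights on the kept
    # channels, minimizing || X_keep @ B - Y ||^2.
    X_keep = X[:, keep, :].reshape(n, -1)                # (n, len(keep)*k*k)
    B, *_ = np.linalg.lstsq(X_keep, Y, rcond=None)       # (len(keep)*k*k, c_out)
    W_new = B.T.reshape(c_out, len(keep), kk)
    return keep, W_new
```

In practice the sparsity penalty `alpha` would be increased until the desired number of channels remains, and the procedure would be applied layer by layer (and generalized to multi-branch structures) as the abstract describes.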
Related Publications
Accelerating Very Deep Convolutional Networks for Classification and Detection
This paper aims to accelerate the test-time computation of convolutional neural networks (CNNs), especially very deep CNNs [1] that have substantially impacted the computer visi...
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration
Previous works utilized "smaller-norm-less-important" criterion to prune filters with smaller norm values in a convolutional neural network. In this paper, we analyze this norm-...
Is Second-Order Information Helpful for Large-Scale Visual Recognition?
By stacking layers of convolution and nonlinearity, convolutional networks (ConvNets) effectively learn from low-level to high-level features and discriminative representations. ...
Efficient and accurate approximations of nonlinear convolutional networks
This paper aims to accelerate the test-time computation of deep convolutional neural networks (CNNs). Unlike existing methods that are designed for approximating linear filters ...
ResNeSt: Split-Attention Networks
The ability to learn richer network representations generally boosts the performance of deep learning models. To improve representation-learning in convolutional neural networks...
Publication Info
- Year: 2017
- Type: preprint
- Pages: 1398-1406
- Citations: 2474
- Access: Closed
Identifiers
- DOI: 10.1109/iccv.2017.155