Implicit Filter Sparsification In Convolutional Neural Networks
We show that implicit filter-level sparsity manifests in convolutional neural networks (CNNs) which employ Batch Normalization and ReLU activation, and are trained with adaptive gradient descent techniques and L2 regularization or weight decay. Through an extensive empirical study (Mehta et al., 2019) we hypothesize the mechanism behind the sparsification process, and find surprising links to certain filter sparsification heuristics proposed in the literature. The emergence, and subsequent pruning, of selective features is observed to be one of the contributing mechanisms, leading to feature sparsity at par with or better than certain explicit sparsification / pruning approaches. In this workshop article we summarize our findings, and point out corollaries of selective-feature penalization which could also be employed as heuristics for filter pruning.
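As an illustration of the phenomenon described above, the sketch below shows one simple way to measure implicit filter sparsity in a trained network: filters whose Batch Normalization scale parameter (gamma) has collapsed toward zero are effectively inactive and could be pruned. The function name and the threshold value are illustrative assumptions, not quantities from the paper.

```python
# Minimal sketch (not the authors' code): after training a CNN with BatchNorm + ReLU
# under an adaptive optimizer (e.g. Adam) and L2 regularization / weight decay,
# count filters whose BatchNorm scale has (near-)vanished.
import torch
import torch.nn as nn

def count_implicitly_sparse_filters(model: nn.Module, threshold: float = 1e-3):
    """Count filters whose BatchNorm scale parameter is below `threshold`."""
    total, sparse = 0, 0
    for module in model.modules():
        if isinstance(module, nn.BatchNorm2d):
            gamma = module.weight.detach().abs()  # per-filter scale parameters
            total += gamma.numel()
            sparse += int((gamma < threshold).sum())
    return sparse, total

# Example usage on a (trained) torchvision model:
# from torchvision.models import resnet18
# sparse, total = count_implicitly_sparse_filters(resnet18())
# print(f"{sparse}/{total} filters have near-zero BatchNorm scale")
```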
Comment: ODML-CDNNR 2019 (ICML'19 workshop) extended abstract of the CVPR 2019 paper "On Implicit Filter Level Sparsity in Convolutional Neural Networks," Mehta et al. (arXiv:1811.12495).