Enhancing the Regularization Effect of Weight Pruning in Artificial Neural Networks
Artificial neural networks (ANNs) may not be worth their computational/memory
costs when used in mobile phones or embedded devices. Parameter-pruning
algorithms combat these costs, with some algorithms capable of removing over
90% of an ANN's weights without harming the ANN's performance. Removing weights
from an ANN is a form of regularization, but existing pruning algorithms do not
significantly reduce generalization error. We show that pruning ANNs can
improve generalization if pruning targets large weights instead of small
weights. Applying our pruning algorithm to an ANN leads to a higher image
classification accuracy on CIFAR-10 data than applying the popular regularizer
dropout. Our pruning couples this higher accuracy with an 85% reduction in the
ANN's parameter count.
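To make the core idea concrete, the sketch below contrasts conventional magnitude pruning (remove the smallest-magnitude weights) with the inverted criterion described here (remove the largest). The NumPy implementation, function names, and per-matrix granularity are illustrative assumptions, not the paper's exact procedure.

    import numpy as np

    def prune_largest(weights: np.ndarray, fraction: float) -> np.ndarray:
        """Zero the given fraction of entries with the LARGEST magnitudes.

        Sketch of the abstract's idea; the paper's actual criterion and
        pruning schedule may differ.
        """
        threshold = np.quantile(np.abs(weights), 1.0 - fraction)
        pruned = weights.copy()
        pruned[np.abs(pruned) >= threshold] = 0.0
        return pruned

    def prune_smallest(weights: np.ndarray, fraction: float) -> np.ndarray:
        """Conventional magnitude pruning: zero the smallest-magnitude weights."""
        threshold = np.quantile(np.abs(weights), fraction)
        pruned = weights.copy()
        pruned[np.abs(pruned) <= threshold] = 0.0
        return pruned

    # Example: prune 85% of a random weight matrix each way (the 85% figure
    # echoes the parameter-count reduction reported above).
    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 128))
    print(np.mean(prune_largest(W, 0.85) == 0.0))   # ~0.85 sparsity
    print(np.mean(prune_smallest(W, 0.85) == 0.0))  # ~0.85 sparsity

In practice either function would be applied to each weight matrix of a trained network, followed by fine-tuning; that training loop is omitted here.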