
    Online Filter Clustering and Pruning for Efficient Convnets

    Pruning filters is an effective method for accelerating deep neural networks (DNNs), but most existing approaches prune filters directly on a pre-trained network, which limits the achievable acceleration. Although each filter has its own effect in a DNN, if two filters are identical to each other, one of them can be pruned safely. In this paper, we add an extra cluster loss term to the loss function that forces the filters within each cluster to become similar online, during training. After training, we keep one filter in each cluster, prune the others, and fine-tune the pruned network to compensate for the accuracy loss. In particular, the clusters in every layer can be defined in advance, which is effective for pruning DNNs containing residual blocks. Extensive experiments on the CIFAR10 and CIFAR100 benchmarks demonstrate the competitive performance of our proposed filter pruning method.
    Comment: 5 pages, 4 figures
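
    The abstract does not give the cluster loss term explicitly. A minimal PyTorch sketch of one plausible form is below: each filter is penalized by its squared distance to the centroid of its cluster, so same-cluster filters are pulled together during training. The function name cluster_loss, the fixed cluster assignments, and the weight lam are illustrative assumptions, not the authors' exact formulation.

        import torch

        def cluster_loss(weight, assignments, lam=1e-3):
            # weight:      conv weight, shape (out_channels, in_channels, k, k)
            # assignments: LongTensor of shape (out_channels,), mapping each
            #              filter to a cluster id (assumed defined in advance)
            # lam:         regularization strength (illustrative value)
            flat = weight.view(weight.size(0), -1)   # one row per filter
            loss = weight.new_zeros(())
            for c in assignments.unique():
                members = flat[assignments == c]     # filters in cluster c
                centroid = members.mean(dim=0, keepdim=True)
                loss = loss + ((members - centroid) ** 2).sum()
            return lam * loss

        # Usage: add the term to the task loss so same-cluster filters
        # converge during training; afterwards keep one filter per cluster,
        # prune the rest, and fine-tune.
        # total = criterion(logits, labels) + cluster_loss(conv.weight, assign)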

    Deep Learning as a Parton Shower

    We make the connection between certain deep learning architectures and the renormalisation group explicit in the context of QCD by using a deep learning network to construct a toy parton shower model. The model aims to describe proton-proton collisions at the Large Hadron Collider. A convolutional autoencoder learns a set of kernels that efficiently encode the behaviour of fully showered QCD collision events. The network is structured recursively so as to ensure self-similarity, and the number of trained network parameters is low. Randomness is introduced via a novel custom masking layer, which also preserves existing parton splittings by using layer-skipping connections. By applying a shower merging procedure, the network can be evaluated on unshowered events produced by a matrix element calculation. The trained network behaves as a parton shower that qualitatively reproduces jet-based observables.
    Comment: 26 pages, 13 figures
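
    The custom masking layer is only described qualitatively here. A hedged PyTorch sketch of one way such a layer could combine randomness with a layer-skipping connection is below: a per-element Bernoulli mask selects between the generated activation and the skipped input, so content already present in the input (e.g. existing splittings) can pass through untouched while randomness enters elsewhere. The class name RandomMaskSkip and the Bernoulli gating scheme are assumptions for illustration, not the paper's actual layer.

        import torch
        import torch.nn as nn

        class RandomMaskSkip(nn.Module):
            # Illustrative masking layer with a layer-skipping connection.
            def __init__(self, keep_prob=0.5):
                super().__init__()
                self.keep_prob = keep_prob  # probability of keeping the skip path

            def forward(self, generated, skipped):
                # mask == 1 -> keep the skipped (original) value;
                # mask == 0 -> use the network's generated value
                mask = torch.bernoulli(
                    torch.full_like(skipped, self.keep_prob))
                return mask * skipped + (1.0 - mask) * generated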