69 research outputs found

    Static analysis of continuous beam with numerical method (FEM)

    The finite element method is a method for the analysis and simulation of real physical phenomena. This paper focuses on this method, applied through a finite element analysis program written in MATLAB, presenting a structural analysis application useful in the fields of forest, mechanical, and structural engineering. The program designed by the authors puts the finite element tool in the engineer's hands for the work needed to optimize a design, with positive effects on the complete analysis of stresses and strains in continuous beams.
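    The authors' MATLAB program is not reproduced in the abstract; as a minimal illustration of the underlying method only, the sketch below assembles Euler-Bernoulli beam-element stiffness matrices for a two-span continuous beam in Python (NumPy stands in for MATLAB, and all values and the load case are illustrative assumptions, not the authors' program).

```python
import numpy as np

def beam_element_stiffness(E, I, L):
    """Euler-Bernoulli beam element stiffness (4x4); DOFs [v1, th1, v2, th2]."""
    c = E * I / L**3
    return c * np.array([
        [ 12.0,    6*L,  -12.0,    6*L  ],
        [  6*L, 4*L**2,   -6*L, 2*L**2 ],
        [-12.0,   -6*L,   12.0,   -6*L  ],
        [  6*L, 2*L**2,   -6*L, 4*L**2 ],
    ])

# Hypothetical example: two-span continuous beam with 3 nodes, a vertical
# support (v = 0) at every node, and a unit moment applied at the middle
# support. E, I, L are set to 1 purely for illustration.
E, I, L = 1.0, 1.0, 1.0
K = np.zeros((6, 6))
for dofs in [(0, 1, 2, 3), (2, 3, 4, 5)]:     # element -> global DOF map
    K[np.ix_(dofs, dofs)] += beam_element_stiffness(E, I, L)

free = [1, 3, 5]     # rotations are free; deflections are fixed at supports
F = np.zeros(6)
F[3] = 1.0           # unit moment at the middle support
u = np.linalg.solve(K[np.ix_(free, free)], F[free])
# u = [theta0, theta1, theta2]; for E = I = L = 1, theta1 = 1/6
```

    The same assemble-and-solve pattern extends directly to more elements per span and to distributed loads via equivalent nodal forces.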

    One-Cycle Pruning: Pruning ConvNets Under a Tight Training Budget

    Introducing sparsity in a neural network has been an efficient way to reduce its complexity while keeping its performance almost intact. Most of the time, sparsity is introduced using a three-stage pipeline: 1) train the model to convergence, 2) prune the model according to some criterion, 3) fine-tune the pruned model to recover performance. The last two steps are often performed iteratively, leading to reasonable results but also to a time-consuming and complex process. In our work, we propose to get rid of the first step of the pipeline and to combine the two other steps in a single pruning-training cycle, allowing the model to jointly learn the optimal weights while being pruned. We do this by introducing a novel pruning schedule, named One-Cycle Pruning, which starts pruning at the very beginning of training and continues until its very end. Adopting such a schedule not only leads to better-performing pruned models but also drastically reduces the training budget required to prune a model. Experiments are conducted on a variety of architectures (VGG-16 and ResNet-18) and datasets (CIFAR-10, CIFAR-100 and Caltech-101), and for relatively high sparsity values (80%, 90%, 95% of weights removed). Our results show that One-Cycle Pruning consistently outperforms commonly used pruning schedules such as One-Shot Pruning, Iterative Pruning and Automated Gradual Pruning, on a fixed training budget.

    Comment: Accepted at Sparsity in Neural Networks (SNN 2021)
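    The abstract does not give the exact formula of the One-Cycle schedule; as a hedged sketch of the general idea, the code below ramps target sparsity from 0 to its final value over the whole run (the cubic ramp is a stand-in assumption, not the paper's formula) and applies standard magnitude pruning at each step.

```python
import numpy as np

def one_cycle_sparsity(step, total_steps, final_sparsity):
    # Monotone ramp from 0 at the start of training to final_sparsity at
    # the end. NOTE: the cubic shape is an illustrative assumption; the
    # paper's exact schedule is not given in the abstract above.
    t = step / total_steps
    return final_sparsity * (1.0 - (1.0 - t) ** 3)

def magnitude_mask(weights, sparsity):
    # Keep the largest-magnitude fraction (1 - sparsity) of the weights.
    k = int(round(sparsity * weights.size))
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    thresh = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > thresh

# During training, the mask would be recomputed every step (or every few
# steps) at the current target sparsity, zeroing the pruned weights:
w = np.random.default_rng(0).normal(size=(64, 64))
total_steps, final_s = 1000, 0.9
for step in range(1, total_steps + 1):
    s = one_cycle_sparsity(step, total_steps, final_s)
    w *= magnitude_mask(w, s)       # gradient update omitted for brevity
```

    In contrast, One-Shot Pruning would apply a single mask after training, and Automated Gradual Pruning ramps sparsity only during a window of the run rather than across all of it.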