260 research outputs found

    A Mutual Learning Framework for Pruned and Quantized Networks

    Model compression is an important topic in deep learning research. It can be divided into two main directions: model pruning and model quantization. However, both methods affect the original accuracy of the model to some degree. In this paper, we propose a mutual learning framework for pruned and quantized networks. We regard the pruned network and the quantized network as two sets of features that are not parallel. The purpose of our mutual learning framework is to better integrate the two sets of features and achieve complementary advantages, which we call feature augmentation. To verify the effectiveness of our framework, we select pairwise combinations of 3 state-of-the-art pruning algorithms and 3 state-of-the-art quantization algorithms. Extensive experiments on CIFAR-10, CIFAR-100 and Tiny-ImageNet show the benefits of our framework: through the mutual learning of the two networks, we obtain a pruned network and a quantized network with higher accuracy than traditional approaches.
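    The training signal implied by the abstract, with each network also learning from the other's predictions, might be sketched as a mutual-learning loss: each network minimizes its own cross-entropy plus a KL term toward its partner's output distribution. The function names and the exact loss form below are illustrative assumptions, not the paper's actual method:

    ```python
    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def kl(p, q):
        # KL(p || q), summed over classes, averaged over the batch
        return np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1))

    def cross_entropy(p, y):
        # mean negative log-likelihood of the true labels y
        return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

    def mutual_learning_losses(logits_pruned, logits_quant, labels):
        """Per-network losses: own cross-entropy plus a KL term that
        pulls each network toward the other's predictive distribution."""
        p = softmax(logits_pruned)
        q = softmax(logits_quant)
        loss_pruned = cross_entropy(p, labels) + kl(q, p)
        loss_quant = cross_entropy(q, labels) + kl(p, q)
        return loss_pruned, loss_quant
    ```

    In a full training loop each loss would be backpropagated only through its own network, so the two models exchange soft targets without sharing gradients.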

    Microwave Power Measurements: Standards and Transfer Techniques

    In this chapter, precision power measurement, probably the most important area in RF and microwave metrology, is discussed. First, the background of RF and microwave power measurements and standards is introduced. Second, the working principle of the primary power standard (the microcalorimeter) is described, followed by a discussion of the direct comparison transfer technique. Finally, the performance evaluation and uncertainty estimation for microwave power measurements are discussed.

    CT Experiments and Image Processing for the Water-Oil Displacement at Pore Scale

    We established a CT experimental method for the study of water-oil displacement at pore scale. The microscopic core model, made of reservoir coring materials, faithfully reflects the surface properties and pore structure of reservoir rocks. We scanned the core model at different water-flooding stages using a SkyScan1174v2 CT scanner, obtaining high-resolution images. The present paper adopts a new image segmentation method based on discriminatory analysis constrained by the measured porosity and oil saturation, which improves segmentation accuracy. We used the new algorithm to segment pores and residual oil from the scanned images. The segmentation results were in agreement with those measured from the core experiments.
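    The porosity-constrained segmentation described above could be sketched, under the simplifying assumption of a single global grayscale threshold, as choosing the cutoff whose pore fraction best matches the independently measured porosity. The function names and the reduction to one threshold are illustrative assumptions, not the paper's full discriminatory analysis:

    ```python
    import numpy as np

    def porosity_constrained_threshold(image, porosity):
        """Pick the grayscale threshold whose below-threshold (pore)
        fraction best matches the independently measured porosity."""
        best_t, best_err = None, np.inf
        for t in np.unique(image):
            frac = np.mean(image <= t)  # fraction classified as pore
            err = abs(frac - porosity)
            if err < best_err:
                best_t, best_err = t, err
        return best_t

    def segment(image, porosity):
        # Boolean pore mask: True marks voxels classified as pore space
        return image <= porosity_constrained_threshold(image, porosity)
    ```

    The same constraint could be applied a second time within the pore phase, using the measured oil saturation to separate residual oil from water.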
