
    Depth estimation of inner wall defects by means of infrared thermography

    There are two common approaches to interpreting infrared thermography data: qualitative and quantitative. In some situations the qualitative approach is sufficient, but accurate interpretation requires the quantitative one. This report proposes a method to quantitatively estimate the depth of a defect at the inner wall of a petrochemical furnace. The finite element method (FEM) is used to model the multilayer wall and to simulate the temperature distribution caused by the defect. Five informative parameters are proposed for depth estimation: the maximum temperature over the defect area (Tmax-def), the average temperature at the right edge of the defect (Tavg-right), the average temperature at the left edge of the defect (Tavg-left), the average temperature at the top edge of the defect (Tavg-top), and the average temperature over the sound area (Tavg-so). Artificial neural networks (ANNs) were trained on these parameters to estimate the defect depth. Two ANN architectures, a multilayer perceptron (MLP) and a radial basis function (RBF) network, were trained for various defect depths and used to estimate both the controlled and the testing data. The results show that 100% accuracy of depth estimation was achieved for the controlled data; for the testing data, the accuracy was above 90% for the MLP network and above 80% for the RBF network. These results indicate that the proposed informative parameters are useful for estimating defect depth and that ANNs can be used for quantitative interpretation of thermography data.
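
    A minimal sketch of the kind of workflow the abstract describes, assuming the five temperature parameters as input features and discrete depth classes as targets. The data, feature ranges, depth classes, and network size below are illustrative assumptions, not the paper's setup; scikit-learn's MLPClassifier stands in for the MLP, and the RBF network is not shown.

```python
# Sketch: train an MLP on the five informative temperature parameters
# to classify defect depth. All data here is synthetic and illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Each sample: [Tmax_def, Tavg_right, Tavg_left, Tavg_top, Tavg_so]
X = rng.uniform(low=300.0, high=600.0, size=(200, 5))
# Hypothetical discrete defect-depth classes (e.g. 2, 4, 6, 8 mm)
y = rng.choice([2, 4, 6, 8], size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print("test accuracy:", mlp.score(X_test, y_test))
```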

    Estimating adaptive setpoint temperatures using weather stations

    Reducing both the energy consumption and CO2 emissions of buildings is nowadays one of the main objectives of society. The use of heating and cooling equipment is among the main causes of energy consumption, so reducing their consumption helps to meet this goal. In this context, the use of adaptive setpoint temperatures allows such energy consumption to be significantly decreased. However, having reliable data from an external temperature probe is not always possible due to various factors. This research studies the estimation of such temperatures without using external temperature probes. For this purpose, data were collected from 10 weather stations in Galicia, and prediction models (multivariable linear regression (MLR) and multilayer perceptron (MLP)) were applied based on two approaches: (1) using both the setpoint temperature and the mean daily external temperature from the previous day; and (2) using the mean daily external temperature from the previous 7 days. Both prediction models provide adequate performance for approach 1, obtaining accurate results between 1 month (MLR) and 5 months (MLP). However, for approach 2, only the MLP obtained accurate results from the 6th month onwards. This research ensures the continuity of using adaptive setpoint temperatures even in the case of measurement errors or failures of the external temperature probes.
    Funding: Spanish Ministry of Science, Innovation and Universities 00064742/ITC-20133094; Spanish Ministry of Economy, Industry and Competitiveness BIA 2017-85657-
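
    A minimal sketch of approach 1 under stated assumptions: synthetic daily data, with the previous day's setpoint and mean external temperature used as predictors of today's setpoint. scikit-learn's LinearRegression and MLPRegressor stand in for the paper's MLR and MLP models; all values and the toy setpoint rule are illustrative, not taken from the study.

```python
# Sketch of approach 1: predict today's adaptive setpoint from yesterday's
# setpoint and yesterday's mean daily external temperature (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_days = 365

# Toy external temperature and adaptive setpoint series (illustrative only)
t_ext = 14 + 8 * np.sin(2 * np.pi * np.arange(n_days) / 365) + rng.normal(0, 2, n_days)
setpoint = 18 + 0.3 * t_ext + rng.normal(0, 0.3, n_days)

# Predictors from the previous day, target is the current day's setpoint
X = np.column_stack([setpoint[:-1], t_ext[:-1]])
y = setpoint[1:]

mlr = LinearRegression().fit(X, y)
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1).fit(X, y)

print("MLR R^2:", mlr.score(X, y))
print("MLP R^2:", mlp.score(X, y))
```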

    Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation

    We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
    Comment: Forthcoming in the Proceedings of ICML 2019. New comparisons with several different networks. We now use the Wasserstein distance to produce comparisons. Code available on GitHub. 16 pages, 5 figures, 21 tables.
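
    A rough sketch of what a block-switch-invariant summary network might look like for an order-1 Markov time series, based only on the abstract's description: an inner network is applied to consecutive pairs (blocks), its outputs are summed, and an outer network combines that pooled representation with the head of the series. This is an assumption, not the authors' released code, and uses PyTorch with arbitrary layer sizes.

```python
# Sketch of a PEN-style summary network for an order-1 Markov series.
# phi acts on consecutive pairs; summing over pairs gives block-switch
# invariance; rho maps (first observation, pooled features) to summaries.
import torch
import torch.nn as nn

class PENOrder1(nn.Module):
    def __init__(self, hidden=32, n_summaries=2):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.rho = nn.Sequential(nn.Linear(hidden + 1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_summaries))

    def forward(self, x):                                     # x: (batch, T)
        pairs = torch.stack([x[:, :-1], x[:, 1:]], dim=-1)    # (batch, T-1, 2)
        pooled = self.phi(pairs).sum(dim=1)                   # permutation/block-switch invariant pooling
        head = x[:, :1]                                       # keep the first observation
        return self.rho(torch.cat([head, pooled], dim=-1))

summaries = PENOrder1()(torch.randn(8, 100))   # 8 series of length 100
print(summaries.shape)                         # torch.Size([8, 2])
```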