
    Global Sensitivity Analysis of Stochastic Computer Models with joint metamodels

    The global sensitivity analysis method, used to quantify the influence of uncertain input variables on the response variability of a numerical model, applies to deterministic computer codes, for which the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, whose output variability is induced by uncontrollable variables. The framework of the joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models (based on Generalized Additive Models and Gaussian processes) are discussed. The relevance of these new models is analyzed in terms of the resulting variance-based sensitivity indices on two case studies. Results show that the joint modeling approach leads to accurate sensitivity index estimations even when clear heteroscedasticity is present.
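
    A minimal sketch of the joint mean/dispersion idea, assuming scikit-learn Gaussian processes: one metamodel fits the mean response and a second fits the log of the squared residuals as a dispersion component. The stochastic_code toy function, kernels, and sample sizes are illustrative assumptions, not the joint GAM/Gaussian-process models or case studies of the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy stand-in for a stochastic computer code: a deterministic trend plus
# input-dependent (heteroscedastic) noise (illustrative only).
def stochastic_code(X):
    mean = np.sin(2 * np.pi * X[:, 0]) + X[:, 1]
    dispersion = 0.1 + 0.4 * X[:, 0]          # noise standard deviation grows with x1
    return mean + dispersion * rng.standard_normal(len(X))

X = rng.uniform(0.0, 1.0, size=(200, 2))
y = stochastic_code(X)

# Joint model, step 1: Gaussian process metamodel of the mean component.
gp_mean = GaussianProcessRegressor(RBF([0.2, 0.2]) + WhiteKernel(), normalize_y=True)
gp_mean.fit(X, y)

# Joint model, step 2: Gaussian process metamodel of the dispersion component,
# fitted on log squared residuals so the predicted variance stays positive.
res2 = (y - gp_mean.predict(X)) ** 2
gp_disp = GaussianProcessRegressor(RBF([0.2, 0.2]) + WhiteKernel(), normalize_y=True)
gp_disp.fit(X, np.log(res2 + 1e-12))

x_new = np.array([[0.3, 0.7]])
print("predicted mean:", gp_mean.predict(x_new))
print("predicted noise variance:", np.exp(gp_disp.predict(x_new)))
```

    In the setting of the paper, variance-based sensitivity indices would then be computed from both the mean and dispersion components; only the two metamodels are sketched here.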

    An efficient methodology for modeling complex computer codes with Gaussian processes

    Complex computer codes are often too time-consuming to be used directly for uncertainty propagation studies, global sensitivity analysis, or optimization problems. A well-known and widely used way to circumvent this limitation consists in replacing the complex computer code by a reduced model, called a metamodel or response surface, which represents the computer code while requiring an acceptable calculation time. One particular class of metamodels is studied here: the Gaussian process model, characterized by its mean and covariance functions. A specific estimation procedure is developed to fit a Gaussian process model in difficult cases (nonlinear relations, highly dispersed or discontinuous outputs, high-dimensional inputs, inadequate sampling designs, ...). The efficiency of this algorithm is compared with that of other existing algorithms on an analytical test case. The proposed methodology is also illustrated on a complex hydrogeological computer code simulating radionuclide transport in groundwater.
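
    A minimal sketch of the metamodel idea, assuming scikit-learn: an inexpensive test function stands in for the expensive code, a Gaussian process with an anisotropic Matérn kernel is fitted to a modest design, and its predictivity is checked on held-out runs. The expensive_code function, kernel, and design are illustrative assumptions, not the estimation procedure developed in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Inexpensive stand-in for the expensive computer code (Ishigami test function).
def expensive_code(X):
    return np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2 + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])

rng = np.random.default_rng(1)
X = rng.uniform(-np.pi, np.pi, size=(300, 3))   # a space-filling design (e.g. LHS) would be preferable
y = expensive_code(X)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Fit the Gaussian process metamodel with an anisotropic Matern kernel (one length scale per input).
gp = GaussianProcessRegressor(kernel=Matern(length_scale=[1.0, 1.0, 1.0], nu=2.5),
                              normalize_y=True, n_restarts_optimizer=5)
gp.fit(X_train, y_train)

# Predictivity coefficient Q2 on held-out runs: values close to 1 mean the metamodel
# can stand in for the code in uncertainty and sensitivity studies.
print("Q2 =", r2_score(y_test, gp.predict(X_test)))
```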

    Global sensitivity analysis for models with spatially dependent outputs

    The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed to replace the CPU-time-expensive computer code by an inexpensive mathematical function that predicts the computer code output. Common metamodel-based sensitivity analysis methods are well suited to computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce a method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the metamodeling of the wavelet coefficients by Gaussian processes. An analytical example is presented to clarify the various steps of our methodology. The technique is then applied to a real hydrogeological case: for each model input variable, a spatial map of Sobol' indices is thus obtained.
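
    A rough sketch of the two main ingredients, wavelet decomposition of the spatial output and Gaussian-process metamodeling of the wavelet coefficients, assuming PyWavelets and scikit-learn. The spatial_code toy model, the Haar basis, and the truncation to the 30 most variable coefficients are illustrative assumptions; the paper's own choices and the final step of mapping Sobol' indices back to space are not reproduced here.

```python
import numpy as np
import pywt
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical spatial computer code: two scalar inputs produce a 16x16 output map.
grid = np.linspace(0.0, 1.0, 16)
GX, GY = np.meshgrid(grid, grid)

def spatial_code(x):
    return np.sin(2 * np.pi * (GX * x[0] + GY * x[1]))

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(80, 2))
maps = np.array([spatial_code(x) for x in X])

# Step 1: wavelet decomposition of every output map, flattened into coefficient vectors.
decomps = [pywt.coeffs_to_array(pywt.wavedec2(m, "haar", level=2)) for m in maps]
C = np.array([arr.ravel() for arr, _ in decomps])
coeff_shape, coeff_slices = decomps[0][0].shape, decomps[0][1]

# Step 2: one Gaussian process metamodel per retained coefficient (the most variable ones here).
keep = np.argsort(C.var(axis=0))[::-1][:30]
gps = {j: GaussianProcessRegressor(RBF([0.2, 0.2]), normalize_y=True).fit(X, C[:, j]) for j in keep}

# Step 3: predict the coefficients for a new input and invert the wavelet transform.
x_new = np.array([[0.4, 0.6]])
c_pred = np.zeros(C.shape[1])
for j, gp in gps.items():
    c_pred[j] = gp.predict(x_new)[0]
coeffs_pred = pywt.array_to_coeffs(c_pred.reshape(coeff_shape), coeff_slices, output_format="wavedec2")
map_pred = pywt.waverec2(coeffs_pred, "haar")
print(map_pred.shape)  # (16, 16); Sobol' maps would follow by computing indices coefficient-wise and recombining
```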

    The rhizosphere microbiota of the zinc and cadmium hyperaccumulators Arabidopsis halleri and Noccaea caerulescens is highly convergent in Prayon (Belgium)

    The Prayon site is known as a zinc-polluted area where two zinc and cadmium hyperaccumulator plant species currently coexist, although Arabidopsis halleri was introduced more recently than Noccaea caerulescens. While soil microorganisms may influence metal uptake, the microbial community present in the rhizosphere of hyperaccumulators remains poorly known. Plants of both species were sampled with their bulk and rhizosphere soil from different plots of the Prayon site. Soil properties (ionome, pH, water composition, temperature) were analyzed, as well as the shoot ionome and the expression levels of metal transporter genes (HMA3, HMA4, ZIP4/ZNT1, ZIP6, MTP1). The taxonomic diversity of the microorganisms in soil samples was then determined by 16S rRNA metabarcoding and compared at the Operational Taxonomic Unit (OTU) level and across different taxonomic levels. Our elemental analyses confirmed that the site is still highly contaminated with zinc and cadmium and that both plant species indeed hyperaccumulate these elements in situ. Although the pollution is overall high, it is heterogeneous at the site scale and correlates with the expression of some metal transporter genes. Metabarcoding analyses revealed a decreasing gradient of microbial diversity, with more OTUs discovered in the rhizosphere than in the bulk soil, especially at the bottom of the cores; the variability, however, increases with the distance from the roots. Using an ad hoc pseudo-taxonomy to bypass the biases caused by a high proportion of unclassified and unknown OTUs, we identified Chloroflexi, Armatimonadetes, Pirellulaceae, Gemmatimonadetes and Chitinophagaceae as the drivers of the differences along the cores. In contrast, no significant difference was identified between the rhizosphere compositions of A. halleri and N. caerulescens. This suggests that, despite their distinct colonization histories in Prayon, the two plant species have now recruited highly convergent microbial communities in the rhizosphere.

    Effect of the cooling rate on encapsulant's crystallinity and optical properties, and photovoltaic modules' lifetime

    Since the rise of renewable energy, the performance and lifetime of photovoltaic (PV) modules have been a major international concern. The mechanical bonding between the different components and the choice of materials can significantly improve both the performance and the lifetime of PV modules. The manufacturing process also plays a significant part in module lifetime [G. Oreski, B. Ottersböck, A. Omazic, Degradation Processes and Mechanisms of Encapsulants, in Durability and Reliability of Polymers and Other Materials in Photovoltaic Modules (Elsevier, 2019), pp. 135–152]. This work deals with the controlled cooling step of the manufacturing process. The aim is to characterize its influence on the properties of an encapsulant and on module degradation, as part of an effort to improve both the performance and the lifetime of PV modules. First, the work focuses on describing the actual temperature seen by a thermoplastic polyolefin encapsulant during the lamination process. A multi-chamber R&D laminator is used and studied in order to better characterize the industrial equipment. Results show that the cooling process reduces the cool-down time by a factor of ∼5 compared with natural air convection. Secondly, the material's microstructure is analysed by Differential Scanning Calorimetry (DSC) and the impact of the process is quantified: it influences the encapsulant crystallites' size distribution without modifying the total crystallinity. Thirdly, the impact of the cooling process on optical properties is investigated. Using spectrophotometry and haze-meter characterization, coupled with a known light spectrum, the light intensity coming out of the material is analysed. Results show that the cooling process does not influence transmittance or reflectance; however, a 34% reduction in the haze factor is recorded when using the industrial laminator cooling process. Fourthly, the mechanical bond strength between glass and encapsulant is characterized over ageing. Normalized 10 mm wide strips are used to estimate the bond strength. It is shown that applying pressure during cooling does not influence the bond strength between glass and encapsulant after 1000 h of damp-heat ageing. Finally, the impact of the cooling process on the ageing of PV modules is discussed. Two accelerated ageing methods, 300 thermal cycles and 1000 h of damp heat, are used to speed up the ageing processes. The electrical parameters of the PV modules are analysed and used to assess module degradation. Modules manufactured with the cooling process are more sensitive to damp heat after 500 h than modules cooled by natural convection, whereas no significant differences were found in thermal-cycling ageing.

    De Colorando Auro: Medieval colouring techniques researched using modern analytical techniques

    The visual appearance of gold and gilding can be influenced in many ways, such as by changing the composition of the gold alloy or the nature of the gilded substrate. A less well-known medieval technique, reported in historical treatises, is the chemical treatment of the gilded surface itself, after application and burnishing of the gilding. We report here results from the study of the Holy Lady Shrine of Huy (13th-century Mosan art, Belgium), on which a possible artificial colouration of the gildings was detected. This finding raised many questions regarding applicable conservation-restoration treatments. We give an overview of the results obtained, along with a discussion of the methodology developed to study this colouration process, highlighting the necessary complementarity between laboratory and synchrotron-based analytical approaches. The three-step methodology proposed in this paper is generic for most cultural heritage problems where the application of ancient surface-modification recipes is suspected but where the scarcity of the historical samples does not allow their direct study.

    Calculations of Sobol indices for the Gaussian process metamodel

    Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques require a large number of model evaluations, which is often unaffordable for time-expensive computer codes. A well-known and widely used solution consists in replacing the computer code by a metamodel, which predicts the model responses with a negligible computation time and makes the estimation of Sobol indices straightforward. In this paper, we discuss the Gaussian process model, which gives analytical expressions of the Sobol indices. Two approaches are studied to compute these indices: the first is based on the predictor of the Gaussian process model and the second on the global stochastic process model. Comparisons between the two estimates on analytical examples show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach accounts for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
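
    A minimal sketch of the first of the two approaches above, assuming scikit-learn: first-order Sobol indices are estimated by running a Monte Carlo pick-freeze (Saltelli) estimator on the Gaussian process predictor. The toy code function, kernel, and sample sizes are illustrative assumptions, and the second approach based on the global stochastic process model (which yields confidence intervals) is not shown.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy stand-in for the expensive computer code (illustrative only).
def code(X):
    return X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 2]

rng = np.random.default_rng(3)
X_train = rng.uniform(0.0, 1.0, size=(150, 3))
gp = GaussianProcessRegressor(Matern(length_scale=[1.0, 1.0, 1.0], nu=2.5), normalize_y=True)
gp.fit(X_train, code(X_train))

# First-order Sobol indices from the GP predictor via the pick-freeze (Saltelli) estimator.
def first_order_sobol(predict, d, n=20_000):
    A = rng.uniform(0.0, 1.0, size=(n, d))
    B = rng.uniform(0.0, 1.0, size=(n, d))
    yA, yB = predict(A), predict(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # ABi matches B in column i and A elsewhere
        S[i] = np.mean(yB * (predict(ABi) - yA)) / var
    return S

print("first-order Sobol indices:", first_order_sobol(gp.predict, d=3))
```

    Replacing the expensive code with the fitted predictor is what makes the large Monte Carlo sample affordable; the estimates inherit the metamodel's approximation error, which is exactly what the paper's second approach quantifies.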

    Variants in the GPR146 Gene Are Associated With a Favorable Cardiometabolic Risk Profile

    BACKGROUND: In mice, GPR146 (G-protein-coupled receptor 146) deficiency reduces plasma lipids and protects against atherosclerosis. Whether these findings translate to humans is unknown. METHODS: Common and rare genetic variants in the GPR146 gene locus were used as research instruments in the UK Biobank. The Lifelines cohort, the Copenhagen City Heart Study, and a cohort of individuals with familial hypobetalipoproteinemia were used to find and study rare GPR146 variants. RESULTS: In the UK Biobank, carriers of the common rs2362529-C allele present with lower low-density lipoprotein cholesterol, apo (apolipoprotein) B, high-density lipoprotein cholesterol, apoAI, CRP (C-reactive protein), and plasma liver enzymes compared with noncarriers. Carriers of the common rs1997243-G allele, associated with higher GPR146 expression, present with the exact opposite phenotype. The associations of the above alleles with plasma lipids are allele dose-dependent. Heterozygous carriers of a rare coding variant (p.Pro62Leu; n=2615), predicted to be damaging, show stronger reductions in the above parameters than carriers of the common rs2362529-C allele. The p.Pro62Leu variant is furthermore shown to segregate with low low-density lipoprotein cholesterol in a family with familial hypobetalipoproteinemia. Compared with controls, carriers of the common rs2362529-C allele show a marginally reduced risk of coronary artery disease (P=0.03), concomitant with a small effect of this variant on low-density lipoprotein cholesterol (average decrease of 2.24 mg/dL in homozygotes). Finally, Mendelian randomization analyses suggest a causal relationship between GPR146 gene expression and plasma lipid and liver enzyme levels. CONCLUSIONS: This study shows that carriers of these new genetic GPR146 variants have a beneficial cardiometabolic risk profile, but it remains to be shown whether genetic or pharmaceutical inhibition of GPR146 protects against atherosclerosis in humans.

    Global sensitivity analysis of stochastic computer models with joint metamodels

    The global sensitivity analysis method used to quantify the influence of uncertain input variables on the variability of numerical model responses has already been applied to deterministic computer codes; deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, for which the result of each code run is itself random. The framework of the joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models are discussed and a new Gaussian-process-based joint model is proposed. The relevance of these models is analyzed on two case studies. Results show that the joint modeling approach yields accurate sensitivity index estimates even when the heteroscedasticity is strong.