
    Stochastic collocation approach with adaptive mesh refinement for parametric uncertainty analysis

    The presence of a high-dimensional stochastic parameter space with discontinuities poses major computational challenges in analyzing and quantifying the effects of uncertainties in a physical system. In this paper, we propose a stochastic collocation method with adaptive mesh refinement (SCAMR) to deal with high-dimensional stochastic systems with discontinuities. Specifically, the proposed approach uses a generalized polynomial chaos (gPC) expansion with a Legendre polynomial basis and solves for the gPC coefficients using the least squares method. It also implements an adaptive mesh (element) refinement strategy that checks for abrupt variations in the output, based on the second-order gPC approximation error, in order to track discontinuities or non-smoothness. In addition, the proposed method includes a criterion for checking possible dimensionality reduction and, consequently, for decomposing the full-dimensional problem into a number of lower-dimensional subproblems. This criterion checks all existing interactions between the input dimensions of a given problem based on the high-dimensional model representation (HDMR) method, and therefore automatically yields subproblems that involve only the interacting dimensions. The efficiency of the approach is demonstrated using both smooth and non-smooth function examples with input dimensions up to 300, and the approach is compared against other existing algorithms.
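    The abstract's core building block, a Legendre-basis gPC surrogate fitted by least squares, can be sketched in one dimension; the function names and the test response below are illustrative, not the authors' code:

```python
import numpy as np
from numpy.polynomial import legendre

def gpc_fit(xi, y, order):
    """Least-squares gPC coefficients for samples xi in [-1, 1]."""
    Phi = legendre.legvander(xi, order)       # columns are P_0(xi), ..., P_order(xi)
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coeffs

def gpc_eval(coeffs, xi):
    return legendre.legval(xi, coeffs)

rng = np.random.default_rng(0)
xi = rng.uniform(-1.0, 1.0, 200)              # uniform input -> Legendre basis is the natural choice
y = np.exp(xi)                                # smooth model response to approximate
c = gpc_fit(xi, y, order=6)
err = np.max(np.abs(gpc_eval(c, xi) - y))     # surrogate error at the training samples
```

    For a smooth response like this, the error decays rapidly with the polynomial order; the adaptive element refinement described above only becomes necessary once the response is discontinuous or non-smooth.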

    A novel model for uncertainty propagation analysis applied for human thermal comfort evaluation

    The comfort sensation is mainly affected by six variables: air temperature, mean radiant temperature, air velocity, relative humidity, personal metabolism and clothing insulation, each characterized by a different mean value and distribution. To analyze the propagation of their uncertainty, three numerical models are used: the full Monte Carlo Simulation (MCSs), the Monte Carlo Simulation with Trials (MCSt), and a novel model named "Adaptive Derivative-based High Dimensional Model Representation" (AD-HDMR). In the paper these three methods are applied to thermal comfort evaluation through the PMV index, and their efficiency is compared in terms of computational time. To allow a revision of this index, the effect of the different variables is then analyzed.
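    Plain Monte Carlo propagation, the baseline the paper compares against, samples each uncertain input from its distribution and pushes every sample through the model; a minimal sketch, using a hypothetical linear toy response in place of the real (much more involved) PMV formula, with assumed distributions:

```python
import numpy as np

def toy_comfort(t_air, v_air, rh):
    # Hypothetical linear stand-in for the PMV index, NOT the real formula
    return 0.3 * (t_air - 24.0) - 1.5 * v_air + 0.01 * (rh - 50.0)

rng = np.random.default_rng(1)
n = 100_000
t_air = rng.normal(24.0, 1.0, n)    # air temperature [degC], assumed distribution
v_air = rng.normal(0.15, 0.05, n)   # air velocity [m/s], assumed distribution
rh    = rng.normal(50.0, 5.0, n)    # relative humidity [%], assumed distribution

out = toy_comfort(t_air, v_air, rh)
mean, std = out.mean(), out.std()   # statistics of the propagated output
```

    The cost of this baseline is one model evaluation per sample, which is what makes cheaper schemes such as AD-HDMR attractive when the model is expensive.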

    The Theory of Scanning Quantum Dot Microscopy

    Electrostatic forces are among the most common interactions in nature and are omnipresent at the nanoscale. Scanning probe methods represent a formidable approach to studying these interactions locally. The lateral resolution of such images is, however, often limited, as they are based on measuring the force (gradient) due to the entire tip interacting with the entire surface. Recently, we developed scanning quantum dot microscopy (SQDM), a new technique for the imaging and quantification of surface potentials which is based on the gating of a nanometer-sized tip-attached quantum dot by the local surface potential and on the detection of charge state changes via non-contact atomic force microscopy. Here, we present a rigorous formalism within which SQDM can be understood and interpreted quantitatively. In particular, we present a general theory of SQDM based on the classical boundary value problem of electrostatics, which is applicable to the full range of sample properties (conductive vs. insulating, nanostructured vs. homogeneously covered). We elaborate the general theory into a formalism suited for the quantitative analysis of images of nanostructured but predominantly flat and conductive samples.
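    The classical boundary value problem invoked above is Laplace's equation for the potential between electrodes held at fixed voltages; as a purely illustrative building block (not the paper's formalism), a minimal 2-D Jacobi relaxation with Dirichlet boundaries:

```python
import numpy as np

def solve_laplace(phi, fixed, n_iter=5000):
    """Jacobi relaxation of Laplace's equation; `fixed` marks Dirichlet nodes."""
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                      + np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
        phi = np.where(fixed, phi, avg)     # keep boundary potentials pinned
    return phi

n = 32
phi = np.zeros((n, n))
fixed = np.zeros((n, n), dtype=bool)
phi[0, :] = 1.0                             # "sample" electrode held at potential 1
fixed[0, :] = fixed[-1, :] = True           # opposite "tip" side held at 0
fixed[:, 0] = fixed[:, -1] = True           # grounded side walls
sol = solve_laplace(phi, fixed)
```

    By symmetry the potential at the center of this square is 1/4 of the driven electrode's value; the relaxed grid reproduces that.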

    Approximate likelihood inference in generalized linear latent variable models based on integral dimension reduction

    Latent variable models represent a useful tool for the analysis of complex data when the constructs of interest are not observable. A problem with these models is that the integrals involved in the likelihood function cannot be solved analytically. We propose a computational approach, referred to as the Dimension Reduction Method (DRM), that reduces the dimension of the multidimensional integral, making the computation feasible in situations in which quadrature-based methods are not applicable. We discuss the advantages of DRM compared with other existing approximation procedures in terms of both the computational feasibility of the method and the asymptotic properties of the resulting estimators.
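    The one-dimensional building block that quadrature-based likelihood methods rest on, and whose cost explodes when nested over many latent dimensions, is Gauss-Hermite quadrature; a minimal sketch (illustrative of the baseline, not of the DRM itself):

```python
import numpy as np

def gauss_hermite_expectation(f, n_nodes=20):
    """E[f(Z)] for Z ~ N(0, 1) via probabilists' Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    return np.sum(weights * f(nodes)) / np.sqrt(2.0 * np.pi)

val = gauss_hermite_expectation(lambda z: z**2)   # E[Z^2] = 1 for a standard normal
```

    With q latent variables the tensorized rule costs n_nodes**q model evaluations, which is exactly the blow-up that reducing the dimension of the integral is meant to avoid.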

    Characterization of process-oriented hydrologic model behavior with temporal sensitivity analysis for flash floods in Mediterranean catchments

    This paper presents a detailed analysis of 10 flash flood events in the Mediterranean region using the distributed hydrological model MARINE. Characterizing catchment response during flash flood events may provide new and valuable insight into the dynamics involved in extreme catchment response and their dependency on physiographic properties and flood severity. The main objective of this study is to analyze the sensitivity of a flash-flood-dedicated hydrologic model with an approach new to hydrology, which decomposes the variance of model outputs to yield temporal patterns of parameter sensitivity. Such approaches enable the ranking of uncertainty sources for nonlinear and nonmonotonic mappings at a low computational cost. The hydrologic model and the sensitivity analysis are used as learning tools on a large flash flood dataset. With Nash performances above 0.73 on average for this extended set of 10 validation events, the five sensitive parameters of the process-oriented distributed MARINE model are analyzed. This contribution shows that soil depth explains more than 80% of the model output variance when most hydrographs are peaking. Moreover, lateral subsurface transfer is responsible for 80% of the model variance for some catchment-flood event hydrographs during slowly declining limbs. The unexplained variance of the model output, which represents interactions between parameters, turns out to be very low during modeled flood peaks, indicating that the model's parsimonious parameterization is appropriate for tackling flash floods. Interactions observed after model initialization or after rainfall intensity peaks suggest improving the representation of water partitioning between flow components, as well as the initialization itself. This paper provides a practical framework for applying this method to other models, landscapes and climatic conditions, potentially helping to improve process understanding and representation.
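    The variance decomposition referred to above is of the Sobol type; a minimal pick-freeze Monte Carlo sketch on a toy model (the model, coefficients and sample sizes are illustrative, not MARINE):

```python
import numpy as np

def model(x):
    # Toy model: the first input dominates, mimicking soil depth's large share
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.5 * x[:, 2]

rng = np.random.default_rng(2)
n, d = 200_000, 3
A = rng.uniform(0.0, 1.0, (n, d))            # two independent sample matrices
B = rng.uniform(0.0, 1.0, (n, d))
yA, yB = model(A), model(B)

S = []                                       # first-order Sobol indices
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # resample only coordinate i
    yABi = model(ABi)
    # Saltelli-style pick-freeze estimator of Var(E[Y|X_i]) / Var(Y)
    S.append(np.mean(yB * (yABi - yA)) / np.var(yA))
```

    For this linear toy model the exact first-order indices are a_i^2/12 divided by Var(Y) = 17.25/12, so the first input carries about 93% of the output variance; repeating such estimates on sliding time windows of a hydrograph is what yields the temporal sensitivity patterns the paper exploits.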