
    Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario

    A variety of methods is available to quantify uncertainties arising within the modeling of flow and transport in carbon dioxide storage, but thorough comparisons are lacking. Raw data from such storage sites can usually hardly be described by theoretical statistical distributions, since only very limited data are available. Hence, exact information on distribution shapes for all uncertain parameters is very rare in realistic applications. We discuss and compare four different methods tested for data-driven uncertainty quantification based on a benchmark scenario of carbon dioxide storage. In the benchmark, for which we provide data and code, carbon dioxide is injected into a saline aquifer modeled by the nonlinear capillarity-free fractional flow formulation for two incompressible fluid phases, namely carbon dioxide and brine. To cover different aspects of uncertainty quantification, we incorporate various sources of uncertainty, such as uncertainty in boundary conditions, in conceptual model definitions, and in material properties. We consider recent versions of the following non-intrusive and intrusive uncertainty quantification methods: arbitrary polynomial chaos, spatially adaptive sparse grids, kernel-based greedy interpolation, and hybrid stochastic Galerkin. The performance of each approach is demonstrated by assessing the expectation value and standard deviation of the carbon dioxide saturation against a reference statistic based on Monte Carlo sampling. We compare the convergence of all methods, reporting accuracy with respect to the number of model runs and the resolution. Finally, we offer suggestions about the methods' advantages and disadvantages that can guide the modeler in uncertainty quantification for carbon dioxide storage and beyond.
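    The Monte Carlo reference statistic the abstract mentions can be sketched in a few lines. The snippet below is a minimal illustration, not the benchmark itself: `saturation_model` is a hypothetical stand-in for the fractional-flow simulator, and the lognormal/uniform input distributions are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def saturation_model(perm, inj_rate):
    # Hypothetical stand-in for the fractional-flow simulator:
    # saturation rises with injection rate and falls with permeability.
    return inj_rate / (inj_rate + perm)

# Assumed uncertain inputs (illustrative distributions, not benchmark data).
perm = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)
inj = rng.uniform(0.5, 1.5, size=10_000)

# Reference statistic: sample expectation and standard deviation of saturation.
samples = saturation_model(perm, inj)
mean = samples.mean()
std = samples.std(ddof=1)
print(f"E[S] = {mean:.3f}, Std[S] = {std:.3f}")
```

    The surrogate-based methods compared in the paper aim to reproduce these two statistics with far fewer model runs than the plain sampling above.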

    Scalable Approach to Uncertainty Quantification and Robust Design of Interconnected Dynamical Systems

    The development of robust dynamical systems and networks, such as autonomous aircraft systems capable of accomplishing complex missions, is challenged by dynamically evolving uncertainties: model error, the need to operate in hostile, cluttered urban environments, and the distributed, dynamic nature of communication and computation resources. Model-based robust design is difficult because of the complexity of the hybrid dynamic models, which combine continuous vehicle dynamics with discrete models of computation and communication, and because of the sheer size of the problem. We review recent advances in methodology and tools to model, analyze, and design robust autonomous aerospace systems operating in uncertain environments, with emphasis on efficient uncertainty quantification and robust design, using case studies that include model-based target tracking and search, and trajectory planning in an uncertain urban environment. To show that the methodology applies generally to uncertain dynamical systems, we also present applications of the new methods to efficient uncertainty quantification of energy usage in buildings and to stability assessment of interconnected power networks.

    Polynomial-Chaos-based Kriging

    Computer simulation has become the standard tool in many engineering fields for designing and optimizing systems, as well as for assessing their reliability. To cope with demanding analyses such as optimization and reliability, surrogate models (a.k.a. meta-models) have been increasingly investigated over the last decade. Polynomial Chaos Expansions (PCE) and Kriging are two popular non-intrusive meta-modelling techniques. PCE surrogates the computational model with a series of orthonormal polynomials in the input variables, where the polynomials are chosen in accordance with the probability distributions of those input variables. Kriging, on the other hand, assumes that the computer model behaves as a realization of a Gaussian random process whose parameters are estimated from the available computer runs, i.e. input vectors and response values. These two techniques have so far been developed more or less in parallel, with little interaction between the researchers in the two fields. In this paper, PC-Kriging is derived as a new non-intrusive meta-modeling approach combining PCE and Kriging: a sparse set of orthonormal polynomials (PCE) approximates the global behavior of the computational model, whereas Kriging captures the local variability of the model output. An adaptive algorithm similar to least angle regression determines the optimal sparse set of polynomials. PC-Kriging is validated on various benchmark analytical functions that are easy to sample for reference results. From the numerical investigations it is concluded that PC-Kriging performs at least as well as, and often better than, either of the two distinct meta-modeling techniques. The gain in accuracy is largest when the experimental design has a limited size, which is an asset when dealing with computationally demanding models.
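    The core idea, a polynomial chaos basis as the trend of a Kriging (universal kriging) model, can be sketched compactly. The following is a simplified one-dimensional illustration under assumed choices (Legendre trend of degree 3, a Gaussian kernel with fixed length scale, and a toy test function); the paper's method additionally selects a sparse basis adaptively and estimates the kernel hyperparameters.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    # Gaussian (squared-exponential) covariance between 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def fit_pc_kriging(x, y, degree=3, length=0.3, nugget=1e-8):
    # Trend: Legendre polynomials (orthonormal for uniform inputs), as in PCE.
    F = np.polynomial.legendre.legvander(x, degree)
    K = rbf_kernel(x, x, length) + nugget * np.eye(len(x))
    # Generalized least squares for the trend coefficients beta.
    Kinv_y = np.linalg.solve(K, y)
    Kinv_F = np.linalg.solve(K, F)
    beta = np.linalg.solve(F.T @ Kinv_F, F.T @ Kinv_y)
    # Kriging weights for the residual process.
    resid_w = np.linalg.solve(K, y - F @ beta)

    def predict(xs):
        Fs = np.polynomial.legendre.legvander(xs, degree)
        return Fs @ beta + rbf_kernel(xs, x, length) @ resid_w

    return predict

# Toy analytical benchmark, cheap to sample for a reference solution.
f = lambda x: np.sin(4 * x) + 0.3 * x**2
x_train = np.linspace(-1.0, 1.0, 12)
model = fit_pc_kriging(x_train, f(x_train))

x_test = np.linspace(-1.0, 1.0, 200)
err = np.max(np.abs(model(x_test) - f(x_test)))
print(f"max abs error on test grid: {err:.4f}")
```

    The PCE trend captures the smooth global shape while the Gaussian-process residual interpolates the remaining local variation through the training points.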

    Public preferences for policies promoting a healthy diet: a discrete choice experiment

    BACKGROUND: Worldwide obesity rates have nearly tripled over the past five decades. So far, policies to promote a healthier diet have been less intrusive than those to reduce tobacco and alcohol consumption. Not much is known about public support for policies that aim to promote a healthy diet. In this study, a discrete choice experiment (DCE) was used to elicit stated preferences for policies varying in intrusiveness among a representative sample of the public of The Netherlands. METHODS: The choice tasks presented respondents with a hypothetical scenario of two policy packages, each comprising a mix of seven potential policies that differed in level of intrusiveness. We used mixed logit (MXL) models to estimate respondents’ preferences for these policies and performed latent class analyses to identify heterogeneity in preferences. RESULTS: The MXL model showed that positive financial incentives, like subsidies for vegetables and fruit, yielded the most utility. A tax of 50% on sugary drinks was associated with disutility, while a tax of 20% was associated with positive utility compared to no tax at all. We identified three subgroups with distinct preferences for the seven policies to promote a healthy diet, characterized as being “against”, “mixed” and “pro” policies to promote a healthy diet. CONCLUSION: Preferences for policies promoting a healthy diet vary considerably in the Dutch population, particularly in relation to more intrusive policies. This makes selection and implementation of a policy package that has wide public support challenging. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s10198-022-01554-7.
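    The logit choice rule underlying such models turns package utilities into choice probabilities. The sketch below uses hypothetical part-worth utilities (invented for illustration, not the paper's estimates) to show the mechanics; a mixed logit additionally lets these coefficients vary randomly across respondents.

```python
import numpy as np

# Hypothetical part-worth utilities, invented for illustration only,
# signed to mirror the qualitative findings (subsidies liked, 50% tax disliked).
partworths = {"subsidy_fv": 0.8, "sugar_tax_20": 0.3, "sugar_tax_50": -0.4}

def package_utility(policies):
    # Systematic utility of a package is the sum of its policies' part-worths.
    return sum(partworths[p] for p in policies)

def choice_probs(pkg_a, pkg_b):
    # Logit rule: P(A) = exp(V_A) / (exp(V_A) + exp(V_B)), computed stably.
    v = np.array([package_utility(pkg_a), package_utility(pkg_b)])
    e = np.exp(v - v.max())
    return e / e.sum()

p = choice_probs(["subsidy_fv", "sugar_tax_20"], ["sugar_tax_50"])
print(f"P(choose package A) = {p[0]:.2f}")
```

    In estimation this logic runs in reverse: the part-worths are the unknowns, fitted so that predicted probabilities match respondents' observed choices.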

    New single-ended objective measure for non-intrusive speech quality evaluation

    This article proposes a new output-based method for non-intrusive assessment of the speech quality of voice communication systems and evaluates its performance. The method requires access to the processed (degraded) speech only, and is based on measuring perception-motivated objective auditory distances between the voiced parts of the output speech and appropriately matching references extracted from a pre-formulated codebook. The codebook is formed by optimally clustering a large number of parametric speech vectors extracted from a database of clean speech records. The auditory distances are then mapped into objective mean opinion listening quality scores. An efficient data-mining tool known as the self-organizing map (SOM) performs the required clustering and mapping/reference-matching processes. In order to obtain a perception-based, speaker-independent parametric representation of the speech, three domain transformation techniques have been investigated: the first is based on a perceptual linear prediction (PLP) model, the second utilises a bark spectrum (BS) analysis, and the third utilises mel-frequency cepstrum coefficients (MFCC). Reported evaluation results show that the proposed method correlates highly with subjective listening quality scores, yielding accuracy similar to that of ITU-T P.563 while maintaining relatively low computational complexity. Results also demonstrate that the method outperforms PESQ under a number of distortion conditions, such as speech degraded by channel impairments.
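    The codebook-matching step can be sketched with generic feature vectors. Everything below is illustrative: random vectors stand in for PLP/BS/MFCC frames, the codebook is random rather than SOM-trained, and `distance_to_mos` is an assumed monotone mapping, not the article's calibrated one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in codebook of "clean" reference feature vectors (64 entries, 12-dim);
# in the article these come from SOM clustering of clean-speech parameters.
codebook = rng.normal(size=(64, 12))

def auditory_distance(frames, codebook):
    # For each frame of the degraded signal, distance to its nearest
    # clean reference; average over frames gives one distance per utterance.
    d = np.linalg.norm(frames[:, None, :] - codebook[None, :, :], axis=2)
    return d.min(axis=1).mean()

def distance_to_mos(dist, scale=2.0):
    # Assumed mapping: zero distance -> MOS 5, large distance -> toward MOS 1.
    return 1.0 + 4.0 * np.exp(-dist / scale)

# Simulate clean frames (drawn from the codebook) and a degraded version.
clean_frames = codebook[rng.integers(0, 64, size=200)]
degraded_frames = clean_frames + rng.normal(scale=0.8, size=clean_frames.shape)

mos_clean = distance_to_mos(auditory_distance(clean_frames, codebook))
mos_degraded = distance_to_mos(auditory_distance(degraded_frames, codebook))
print(f"MOS clean = {mos_clean:.2f}, MOS degraded = {mos_degraded:.2f}")
```

    The key property, that degradation increases the distance to the clean codebook and therefore lowers the predicted score, is what makes the method work without a reference signal.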

    Reliability assessment of cutting tool life based on surrogate approximation methods

    A novel reliability estimation approach for cutting tools based on advanced approximation methods is proposed. Methods such as stochastic response surfaces and surrogate modeling are tested, starting from a few sample points obtained through fundamental experiments and extending them to models able to estimate tool wear as a function of the key process parameters. Subsequently, different reliability analysis methods are employed, such as Monte Carlo simulation and first- and second-order reliability methods. In the present study, these reliability analysis methods are assessed for estimating the reliability of cutting tools. The results show that the proposed approach efficiently assesses the reliability of a cutting tool from a minimum number of experimental results. Experimental verification for the case of high-speed turning confirms the findings of the present study for cutting tools under flank wear.
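    The surrogate-plus-Monte-Carlo pipeline can be sketched end to end. The wear data, the 0.30 mm flank-wear criterion, and the cutting-speed distribution below are all invented for illustration; the study fits richer response surfaces to real experiments.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical wear "experiments": flank wear (mm) vs cutting speed (m/min).
speed = np.array([100.0, 150.0, 200.0, 250.0, 300.0])
wear = np.array([0.08, 0.12, 0.19, 0.28, 0.41])

# Response-surface surrogate: quadratic polynomial fit to the few samples.
surrogate = np.poly1d(np.polyfit(speed, wear, deg=2))

# Monte Carlo reliability: probability that predicted wear stays below an
# assumed 0.30 mm flank-wear criterion when cutting speed is uncertain.
speed_samples = rng.normal(loc=220.0, scale=25.0, size=100_000)
failures = surrogate(speed_samples) > 0.30
reliability = 1.0 - failures.mean()
print(f"Estimated reliability: {reliability:.3f}")
```

    FORM/SORM, also mentioned in the abstract, would instead locate the most probable failure point on the limit state `wear(speed) = 0.30` and approximate the failure probability analytically, avoiding the large sample.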