
    Contaminant source localization via Bayesian global optimization

    Contaminant source localization problems require efficient and robust methods that can account for geological heterogeneities and accommodate relatively small data sets of noisy observations. As realism commands high-fidelity simulations, computational costs call for global optimization algorithms under parsimonious evaluation budgets. Bayesian optimization approaches are well adapted to such settings, as they allow the exploration of parameter spaces in a principled way so as to iteratively locate the point(s) of global optimum while maintaining an approximation of the objective function with an instrumental quantification of prediction uncertainty. Here, we adapt a Bayesian optimization approach to localize a contaminant source in a discretized spatial domain. We thus demonstrate the potential of such a method for hydrogeological applications and also provide test cases for the optimization community. The localization problem is illustrated for cases where the geology is assumed to be perfectly known. Two 2-D synthetic cases that display sharp hydraulic conductivity contrasts and specific connectivity patterns are investigated. These cases generate highly nonlinear objective functions with multiple local minima. A derivative-free global optimization algorithm relying on a Gaussian process model and on the expected improvement criterion is used to efficiently localize the minimum of the objective function, which corresponds to the contaminant source location. Even though the concentration measurements contain a significant level of proportional noise, the algorithm efficiently localizes the contaminant source. The variations of the objective function are essentially driven by the geology, followed by the design of the monitoring well network. The data and scripts used to generate the objective functions are shared to favor reproducible research. This contribution is important because the functions present multiple local minima and are inspired by a practical field application. Sharing these complex objective functions provides a source of test cases for global optimization benchmarks and should help with designing new and efficient methods to solve this type of problem.
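
    For orientation, below is a minimal, hedged sketch of the kind of algorithm the abstract describes: expected-improvement Bayesian optimization over a discretized candidate grid, using scikit-learn's Gaussian process regressor. The names (`objective`, `X_cand`, `bo_minimize`) are illustrative placeholders, not the paper's shared scripts.

```python
# Hedged sketch: EI-based Bayesian optimization for minimization over a
# finite set of candidate locations (e.g., a discretized spatial domain).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(gp, X_cand, f_min):
    """EI for minimization: (f_min - mu) Phi(z) + sigma phi(z)."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)           # guard against zero variance
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bo_minimize(objective, X_cand, n_init=5, n_iter=30, seed=0):
    """Sequentially evaluate the objective where EI is largest."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X_cand), size=n_init, replace=False)
    X = X_cand[idx]
    y = np.array([objective(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        ei = expected_improvement(gp, X_cand, y.min())
        x_next = X_cand[np.argmax(ei)]         # most promising candidate
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[np.argmin(y)], y.min()
```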

    Global sensitivity analysis of stochastic computer models with joint metamodels

    Global sensitivity analysis, which quantifies the influence of uncertain input variables on the variability of numerical model responses, has so far mostly been applied to deterministic computer codes; deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, for which the result of each code run is itself random. The framework of the joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models are discussed and a new Gaussian process-based joint model is proposed. The relevance of these models is analyzed in two case studies. Results show that the joint modeling approach yields accurate sensitivity index estimators even when heteroscedasticity is strong.
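
    A minimal sketch of the joint-modeling idea, assuming replicated runs at each design point: one Gaussian process emulates the empirical mean, another the empirical log-variance, and a first-order Sobol' index can then be estimated on the mean emulator with a standard pick-and-freeze estimator. The paper's own estimators are not reproduced; `sample_inputs` is a hypothetical user-supplied sampler.

```python
# Hedged sketch: joint GP metamodels of mean and dispersion from replicated
# stochastic-code runs, plus a standard pick-and-freeze Sobol' index
# computed on the mean emulator.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def fit_joint_model(X_design, runs):
    """runs[i] is an array of replicated outputs at design point X_design[i]."""
    m = np.array([r.mean() for r in runs])
    log_v = np.log(np.array([r.var(ddof=1) for r in runs]))
    gp_mean = GaussianProcessRegressor(RBF(), normalize_y=True).fit(X_design, m)
    gp_disp = GaussianProcessRegressor(RBF(), normalize_y=True).fit(X_design, log_v)
    return gp_mean, gp_disp

def first_order_sobol(gp_mean, sample_inputs, j, n=10_000, seed=0):
    """Pick-and-freeze S_j: two input matrices that share only column j."""
    rng = np.random.default_rng(seed)
    A, B = sample_inputs(n, rng), sample_inputs(n, rng)
    B[:, j] = A[:, j]                    # B now agrees with A on input j only
    yA, yB = gp_mean.predict(A), gp_mean.predict(B)
    return np.cov(yA, yB)[0, 1] / np.var(yA, ddof=1)
```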

    Cokriging for multivariate Hilbert space valued random fields: application to multi-fidelity computer code emulation

    In this paper we propose Universal trace co-kriging, a novel methodology for the interpolation of multivariate Hilbert space valued functional data. Such data commonly arise in multi-fidelity numerical modeling of the subsurface and are part of many modern uncertainty quantification studies. Besides theoretical developments we also present a methodological evaluation and comparisons with the recently published projection-based approach by Bohorquez et al. (Stoch Environ Res Risk Assess 31(1):53–70, 2016. https://doi.org/10.1007/s00477-016-1266-y). Our evaluations and analyses were performed on synthetic (oil reservoir) and real field (uranium contamination) subsurface uncertainty quantification case studies. Monte Carlo analyses were conducted to draw important conclusions and to provide practical guidelines for future practitioners.
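
    The paper's trace co-kriging is specific to Hilbert-space-valued fields; as a rough illustration of multi-fidelity emulation in general, the sketch below instead implements the classic Kennedy–O'Hagan autoregressive two-fidelity scheme for scalar outputs. This is a swapped-in standard technique, not the paper's method, and all names are illustrative.

```python
# Hedged sketch: two-level multi-fidelity emulation in the classic
# Kennedy-O'Hagan autoregressive style, y_hi(x) ~ rho * y_lo(x) + delta(x).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def fit_two_fidelity(X_lo, y_lo, X_hi, y_hi):
    """Fit a cheap-code GP, a scale factor rho, and a discrepancy GP."""
    gp_lo = GaussianProcessRegressor(RBF(), normalize_y=True).fit(X_lo, y_lo)
    mu_lo = gp_lo.predict(X_hi)                      # cheap model at hi points
    rho = np.dot(mu_lo, y_hi) / np.dot(mu_lo, mu_lo) # least-squares scale
    gp_delta = GaussianProcessRegressor(RBF(), normalize_y=True)
    gp_delta.fit(X_hi, y_hi - rho * mu_lo)           # model the discrepancy

    def predict(X):
        return rho * gp_lo.predict(X) + gp_delta.predict(X)
    return predict
```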

    Optimization with LES – algorithms for dealing with sampling error of turbulence statistics


    A proposal of location aware shopping assistance using memory-based resampling

    The range of memory specifications across devices running location-aware shopping assistance poses difficulties for developers (in terms of increased time and effort) when developing a resampling algorithm for mobile devices. Thus, a new resampling algorithm is required with a flexible capacity that caters for a range of computing device memory specifications. This paper develops a memory-based resampling scheme for the standard particle filter. The memory-based resampling reads the memory specifications of a mobile device before determining the most suitable resampling function. The authors aim to extend this work in the future by implementing the proposed method in a number of different emerging applications (for example, medical applications and real-time locator systems).
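
    For context, below is a minimal sketch of one standard resampling step that a memory-aware scheme could select among: systematic resampling. The paper's actual device-dependent selection logic and resampling variants are not reproduced here.

```python
# Hedged sketch: systematic resampling, a standard particle-filter step.
# A memory-aware scheme might pick among several such variants depending
# on device memory; that selection logic is not shown.
import numpy as np

def systematic_resample(weights, rng=None):
    """Return indices of resampled particles; weights must sum to 1."""
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one stratified draw
    return np.searchsorted(np.cumsum(weights), positions)
```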

    User preferences in Bayesian multi-objective optimization: the expected weighted hypervolume improvement criterion

    To be published in the proceedings of LOD 2018 – The Fourth International Conference on Machine Learning, Optimization, and Data Science – September 13-16, 2018 – Volterra, Tuscany, Italy. In this article, we present a framework for taking into account user preferences in multi-objective Bayesian optimization in the case where the objectives are expensive-to-evaluate black-box functions. A novel expected improvement criterion to be used within Bayesian optimization algorithms is introduced. This criterion, which we call the expected weighted hypervolume improvement (EWHI) criterion, is a generalization of the popular expected hypervolume improvement to the case where the hypervolume of the dominated region is defined using an absolutely continuous measure instead of the Lebesgue measure. The EWHI criterion takes the form of an integral for which no closed-form expression exists in the general case. To deal with its computation, we propose an importance sampling approximation method. A sampling density that is optimal for the computation of the EWHI for a predefined set of points is crafted, and a sequential Monte Carlo (SMC) approach is used to obtain a sample approximately distributed from this density. The ability of the criterion to produce optimization strategies oriented by user preferences is demonstrated on a simple bi-objective test problem in the cases of a preference for one objective and of a preference for certain regions of the Pareto front.
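
    To illustrate the quantity being generalized, here is a hedged plain Monte Carlo sketch of the weighted hypervolume improvement of a candidate point over a current Pareto front, for two objectives to be minimized: a user-supplied weight density is integrated over the region newly dominated inside a reference box. The paper's tailored importance-sampling/SMC scheme is not reproduced, and all names are illustrative.

```python
# Hedged sketch: plain Monte Carlo estimate of the weighted hypervolume
# improvement (two objectives, minimization) with uniform samples in a
# reference box [lower, upper].
import numpy as np

def dominated(Y, front):
    # Y: (n, 2) samples; front: (m, 2). Y[i] is dominated if some front
    # point is <= Y[i] in every coordinate.
    return np.any(np.all(front[None, :, :] <= Y[:, None, :], axis=2), axis=1)

def weighted_hvi(y_new, front, lower, upper, weight, n=200_000, seed=0):
    """Integral of weight(y) over the region newly dominated by y_new."""
    rng = np.random.default_rng(seed)
    Y = rng.uniform(lower, upper, size=(n, 2))
    gain = dominated(Y, np.vstack([front, y_new])) & ~dominated(Y, front)
    box_volume = np.prod(np.asarray(upper) - np.asarray(lower))
    return box_volume * np.mean(weight(Y) * gain)
```

    With `weight` returning 1 everywhere, this reduces to the ordinary (Lebesgue) hypervolume improvement; a non-uniform weight encodes the user's preference for certain regions of the objective space.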

    Stochastic Intrinsic Kriging for Simulation Metamodelling

    We derive intrinsic Kriging, using Matheron's intrinsic random functions, which eliminate the trend present in classic Kriging. We formulate this intrinsic Kriging as a metamodel for deterministic and random simulation models. For random simulation we derive an experimental design that also specifies the number of replications, which varies with the input combination. We compare intrinsic Kriging and classic Kriging in several numerical experiments with deterministic and random simulations. These experiments suggest that intrinsic Kriging gives more accurate metamodels in most experiments.
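
    For orientation, the intrinsic (IRF-k) Kriging predictor can be written in the standard textbook form below, under generic notation not taken from the paper: the weights filter out polynomial trends up to degree k, so only a generalized covariance K needs to be specified.

```latex
% Hedged sketch: standard IRF-k (intrinsic) Kriging, generic notation.
% The predictor is a weighted sum of observations whose weights
% reproduce all polynomials p of degree <= k (trend filtering):
\[
  \hat{Y}(x_0) = \sum_{i=1}^{n} \lambda_i\, Y(x_i),
  \qquad
  \sum_{i=1}^{n} \lambda_i\, p(x_i) = p(x_0)
  \ \text{for all polynomials } p \text{ of degree} \le k,
\]
% which leads to the linear system below, where F is the polynomial
% basis matrix at the design points, mu the Lagrange multipliers, and
% k_0, f_0 the generalized covariances and basis functions at x_0:
\[
  \begin{pmatrix} K & F \\ F^{\top} & 0 \end{pmatrix}
  \begin{pmatrix} \lambda \\ \mu \end{pmatrix}
  =
  \begin{pmatrix} k_0 \\ f_0 \end{pmatrix}.
\]
```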