
    Uncertainty quantification on Pareto fronts and strategies for high-dimensional Bayesian optimization, with applications in automotive design

    This dissertation deals with optimizing expensive or time-consuming black-box functions to obtain the set of all optimal compromise solutions, i.e. the Pareto front. In automotive design, the evaluation budget is severely limited by the numerical simulation times of the physical phenomena considered. In this context, it is common to resort to “metamodels” (models of models) of the numerical simulators, especially using Gaussian processes. They enable adding new observations sequentially while balancing local search and exploration. Complementing existing multi-objective Expected Improvement criteria, we propose to estimate the position of the whole Pareto front along with a quantification of the associated uncertainty, from conditional simulations of Gaussian processes. A second contribution addresses this problem from a different angle, using copulas to model the multivariate cumulative distribution function. To cope with a possibly high number of variables, we adopt the REMBO algorithm. From a randomly selected direction, defined by a matrix, it allows fast optimization when only a few variables are actually influential, although which ones are unknown. Several improvements are proposed, including a dedicated covariance kernel, a selection procedure for the low-dimensional domain and for the random directions, as well as an extension to the multi-objective setup. Finally, an industrial application in car crash-worthiness demonstrates significant benefits in terms of performance and number of simulations required. It has also been used to test the R package GPareto developed during this thesis.
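
    A minimal, hedged sketch of the first contribution's core idea follows: draw conditional simulations from Gaussian process metamodels of two objectives and extract one plausible Pareto front per simulation, so that the spread of the simulated fronts quantifies the uncertainty on the true front. It uses scikit-learn and a toy bi-objective problem; it is not the GPareto implementation, and all names and settings are illustrative assumptions.

```python
# Hedged sketch: Pareto-front uncertainty from conditional GP simulations.
# Toy objectives and scikit-learn GPs stand in for the expensive simulators
# and metamodels of the thesis; this is not the GPareto package.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def f1(x): return np.sin(3 * x[:, 0]) + x[:, 0]        # toy objective 1 (minimize)
def f2(x): return np.cos(3 * x[:, 0]) + 1.0 - x[:, 0]  # toy objective 2 (minimize)

X_obs = rng.uniform(0.0, 2.0, size=(15, 1))            # small evaluation budget
gps = [GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_obs, f(X_obs))
       for f in (f1, f2)]

X_grid = np.linspace(0.0, 2.0, 200).reshape(-1, 1)     # fine candidate design
n_sim = 100                                            # number of conditional simulations
draws = [gp.sample_y(X_grid, n_samples=n_sim, random_state=1) for gp in gps]  # (200, n_sim) each

def pareto_mask(y):
    """Boolean mask of non-dominated rows of an (n, 2) objective matrix (minimization)."""
    keep = np.ones(y.shape[0], dtype=bool)
    for i in range(y.shape[0]):
        others = np.delete(y, i, axis=0)
        keep[i] = not np.any(np.all(others <= y[i], axis=1) & np.any(others < y[i], axis=1))
    return keep

# One plausible Pareto front per conditional simulation.
fronts = [np.column_stack([draws[0][:, s], draws[1][:, s]]) for s in range(n_sim)]
fronts = [y[pareto_mask(y)] for y in fronts]

# The scatter of the simulated fronts quantifies the remaining uncertainty.
best_f1 = [fr[:, 0].min() for fr in fronts]
print(f"best f1 over simulated fronts: mean {np.mean(best_f1):.3f}, std {np.std(best_f1):.3f}")
```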

    Improving Automated Driving through Planning with Human Internal States

    This work examines the hypothesis that partially observable Markov decision process (POMDP) planning with human driver internal states can significantly improve both safety and efficiency in autonomous freeway driving. We evaluate this hypothesis in a simulated scenario where an autonomous car must safely perform three lane changes in rapid succession. Approximate POMDP solutions are obtained through the partially observable Monte Carlo planning with observation widening (POMCPOW) algorithm. This approach outperforms over-confident and conservative MDP baselines and matches or outperforms QMDP. Relative to the MDP baselines, POMCPOW typically cuts the rate of unsafe situations in half or increases the success rate by 50%.
    Comment: Preprint before submission to IEEE Transactions on Intelligent Transportation Systems. arXiv admin note: text overlap with arXiv:1702.0085
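
    A minimal sketch of the ingredient that distinguishes the POMDP formulation from the MDP baselines: maintaining a belief over a latent driver internal state (here a single aggressiveness parameter) from observed behaviour, so that planning can condition on that belief. The particle filter below is a stand-in, not the POMCPOW planner of the paper; the observation model and numbers are illustrative assumptions.

```python
# Hedged sketch: belief tracking over a latent driver internal state.
# All models and values are toy assumptions, not the paper's driver model.
import numpy as np

rng = np.random.default_rng(42)

def accel_model(aggressiveness, gap):
    """Assumed observation model: expected acceleration given aggressiveness and gap."""
    return aggressiveness * np.tanh(gap - 1.0)

n_particles = 500
particles = rng.uniform(0.0, 2.0, n_particles)    # prior belief over aggressiveness
weights = np.full(n_particles, 1.0 / n_particles)
obs_noise = 0.3

# Observed (gap, acceleration) pairs from the tracked human driver (made up).
observations = [(1.5, 0.8), (1.2, 0.4), (0.9, -0.1)]

for gap, accel in observations:
    likelihood = np.exp(-0.5 * ((accel - accel_model(particles, gap)) / obs_noise) ** 2)
    weights *= likelihood
    weights /= weights.sum()
    # Resample to avoid weight degeneracy.
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)

print("posterior mean aggressiveness:", particles.mean())
# A POMDP planner such as POMCPOW branches on this belief rather than on a point
# estimate, which is what separates it from the over-confident MDP baseline.
```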

    Quantifying uncertainties on excursion sets under a Gaussian random field prior

    We focus on the problem of estimating and quantifying uncertainties on the excursion set of a function under a limited evaluation budget. We adopt a Bayesian approach where the objective function is assumed to be a realization of a Gaussian random field. In this setting, the posterior distribution on the objective function gives rise to a posterior distribution on excursion sets. Several approaches exist to summarize the distribution of such sets based on random closed set theory. While the recently proposed Vorob'ev approach exploits analytical formulae, further notions of variability require Monte Carlo estimators relying on Gaussian random field conditional simulations. In the present work we propose a method to choose Monte Carlo simulation points and obtain quasi-realizations of the conditional field at fine designs through affine predictors. The points are chosen optimally in the sense that they minimize the posterior expected distance in measure between the excursion set and its reconstruction. The proposed method reduces the computational costs due to Monte Carlo simulations and enables the computation of quasi-realizations on fine designs in large dimensions. We apply this reconstruction approach to obtain realizations of an excursion set on a fine grid which allow us to give a new measure of uncertainty based on the distance transform of the excursion set. Finally we present a safety engineering test case where the simulation method is employed to compute a Monte Carlo estimate of a contour line
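
    A small sketch, under toy assumptions, of the Vorob'ev summary mentioned above: estimate the coverage function p(x) = P(f(x) >= T) from a GP posterior on a grid, then pick the threshold whose excursion of p has measure equal to the expected measure of the random excursion set. scikit-learn and SciPy are used for convenience; the test function and threshold are made up, and this is not the paper's implementation.

```python
# Hedged sketch: Vorob'ev expectation of an excursion set {x : f(x) >= T}
# under a GP posterior on a grid. Toy function and threshold.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)
f = lambda x: np.sin(4 * x[:, 0]) + 0.5 * x[:, 0]   # toy objective
T = 0.8                                             # excursion threshold

X_obs = rng.uniform(0.0, 2.0, size=(12, 1))         # limited evaluation budget
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-6).fit(X_obs, f(X_obs))

X_grid = np.linspace(0.0, 2.0, 400).reshape(-1, 1)  # fine grid
mu, sd = gp.predict(X_grid, return_std=True)
coverage = norm.sf((T - mu) / np.maximum(sd, 1e-12))  # p(x) = P(f(x) >= T)

# Vorob'ev threshold: quantile level whose excursion of the coverage function
# has measure equal to the expected measure of the random excursion set.
expected_measure = coverage.mean()
alphas = np.linspace(0.0, 1.0, 1001)
measures = np.array([(coverage >= a).mean() for a in alphas])
alpha_star = alphas[np.argmin(np.abs(measures - expected_measure))]

print(f"expected measure {expected_measure:.3f}, Vorob'ev threshold {alpha_star:.2f}, "
      f"Vorob'ev set measure {(coverage >= alpha_star).mean():.3f}")
```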

    Modification of stochastic ground motion models for matching target intensity measures

    Stochastic ground motion models produce synthetic time‐histories by modulating a white noise sequence through functions that address the spectral and temporal properties of the excitation. The resultant ground motions can then be used in simulation‐based seismic risk assessment applications. This is established by relating the parameters of the aforementioned functions to earthquake and site characteristics through predictive relationships. An important concern with these models is that current approaches for selecting these predictive relationships do not guarantee compatibility with the seismic hazard. This work offers a computationally efficient framework for the modification of stochastic ground motion models to match target intensity measures (IMs) for a specific site and structure of interest. This is set as an optimization problem with a dual objective. The first objective minimizes the discrepancy between the target IMs and the predictions established through the stochastic ground motion model for a chosen earthquake scenario. The second objective constrains the deviation from the model characteristics suggested by existing predictive relationships, guaranteeing that the resultant ground motions not only match the target IMs but are also compatible with regional trends. A framework leveraging kriging surrogate modeling is formulated for performing the resultant multi‐objective optimization, and different computational aspects related to this optimization are discussed in detail. The illustrative implementation shows that the proposed framework can provide ground motions highly compatible with the target IMs while deviating only slightly from existing predictive relationships, and discusses approaches for selecting a final compromise between these two competing objectives.
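
    A hedged, toy sketch of the dual-objective formulation: one term penalizes the misfit between predicted and target intensity measures, the other penalizes deviation from the parameters suggested by existing predictive relationships, and a scalarization weight traces the compromise between them. The IM predictor below is a made-up placeholder, not a stochastic ground motion model or the kriging surrogate of the paper.

```python
# Hedged sketch of the dual-objective trade-off; all models and numbers are illustrative.
import numpy as np
from scipy.optimize import minimize

target_im = np.array([0.35, 0.60])          # target intensity measures (illustrative)
theta_pred = np.array([1.0, 2.0, 0.5])      # parameters suggested by existing predictive relationships

def predicted_im(theta):
    """Placeholder for the (expensive, surrogate-assisted) IM prediction."""
    return np.array([0.3 * theta[0] + 0.05 * theta[2],
                     0.25 * theta[1] + 0.10 * theta[0]])

def objectives(theta):
    misfit = np.sum((predicted_im(theta) - target_im) ** 2)   # objective 1: match target IMs
    deviation = np.sum((theta - theta_pred) ** 2)             # objective 2: stay near regional trends
    return misfit, deviation

# Sweep a scalarization weight to trace the compromise curve between the two goals.
for w in (0.1, 0.5, 0.9):
    res = minimize(lambda th: w * objectives(th)[0] + (1 - w) * objectives(th)[1],
                   x0=theta_pred, method="Nelder-Mead")
    m, d = objectives(res.x)
    print(f"w={w:.1f}  IM misfit={m:.4f}  deviation={d:.4f}")
```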

    Population-based algorithms for improved history matching and uncertainty quantification of Petroleum reservoirs

    In modern field management practices, there are two important steps that shed light on a multimillion-dollar investment. The first step is history matching, where the simulation model is calibrated to reproduce the historical observations from the field. In this inverse problem, different geological and petrophysical properties may provide equally good history matches. Such diverse models are likely to show different production behaviors in the future. This ties the history matching to the second step, uncertainty quantification of predictions. Multiple history-matched models are essential for a realistic uncertainty estimate of the future field behavior. These two steps facilitate decision making and have a direct impact on the technical and financial performance of oil and gas companies. Population-based optimization algorithms have recently enjoyed growing popularity for solving engineering problems. Population-based systems work with a group of individuals that cooperate and communicate to accomplish a task that is normally beyond the capabilities of each individual. These individuals are deployed with the aim of solving the problem with maximum efficiency. This thesis introduces the application of two novel population-based algorithms for history matching and uncertainty quantification of petroleum reservoir models. Ant colony optimization and differential evolution algorithms are used to search the space of parameters to find multiple history-matched models and, using a Bayesian framework, the posterior probabilities of the models are evaluated for the prediction of reservoir performance. It is demonstrated that by bringing in the latest developments in computer science, such as ant colony, differential evolution and multiobjective optimization, we can improve the history matching and uncertainty quantification frameworks. This thesis provides insights into the performance of these algorithms in history matching and prediction, and develops an understanding of their tuning parameters. The research also brings a comparative study of these methods with a benchmark technique called the Neighbourhood Algorithm. This comparison reveals the superiority of the proposed methodologies in various areas such as computational efficiency and match quality.
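
    A minimal sketch of the history matching step using differential evolution, with a toy exponential decline curve standing in for the reservoir simulator; the parameter names, bounds and data are illustrative assumptions, not those of the thesis.

```python
# Hedged sketch: differential evolution for history matching a toy decline-curve model.
import numpy as np
from scipy.optimize import differential_evolution

t = np.linspace(0.0, 10.0, 20)                            # production times (years)
true_params = np.array([100.0, 0.25])                     # (initial rate, decline) used to fake history
observed = true_params[0] * np.exp(-true_params[1] * t) + np.random.default_rng(1).normal(0, 2, t.size)

def simulate(params):
    """Toy forward model standing in for a reservoir simulator."""
    q0, d = params
    return q0 * np.exp(-d * t)

def misfit(params):
    """History-match objective: squared mismatch to observed production."""
    return np.sum((simulate(params) - observed) ** 2)

result = differential_evolution(misfit, bounds=[(10.0, 200.0), (0.01, 1.0)], seed=7, tol=1e-8)
print("history-matched parameters:", result.x)
# In the thesis, many such matched models are retained and weighted in a Bayesian
# framework to quantify uncertainty in the production forecast.
```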

    Quantification of uncertainty of geometallurgical variables for mine planning optimisation

    Interest in geometallurgy has increased significantly over the past 15 years or so because of the benefits it brings to mine planning and operation. Its use and integration into design, planning and operation are becoming increasingly critical, especially in the context of declining ore grades and increasing mining and processing costs. This thesis, comprising four papers, offers methodologies and methods to quantify geometallurgical uncertainty and enrich the block model with geometallurgical variables, which contribute to improved optimisation of mining operations. This enhanced block model is termed a geometallurgical block model. Bootstrapped non-linear regression models by projection pursuit were built to predict grindability indices and recovery, and to quantify model uncertainty. These models are useful for populating the geometallurgical block model with response attributes. New multi-objective optimisation formulations for block caving mining were formulated and solved by a meta-heuristic solver, focussing on maximising the project revenue and, at the same time, minimising several risk measures. A novel clustering method, able to use both continuous and categorical attributes and to incorporate expert knowledge, was also developed for geometallurgical domaining, which characterises the deposit according to its metallurgical response. The concept of geometallurgical dilution was formulated and used for optimising production scheduling in an open-pit case study.
    Thesis (Ph.D.) (Research by Publication) -- University of Adelaide, School of Civil, Environmental and Mining Engineering, 201
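
    A small sketch of the bootstrapped-regression idea used to populate the geometallurgical block model with a response attribute and its model uncertainty. A gradient boosting regressor stands in for the projection pursuit regression of the thesis, and the features and response are synthetic.

```python
# Hedged sketch: bootstrap a regression model to get block-level predictions with uncertainty.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 4))                              # geochemical/mineralogical features
y = 10 + 5 * X[:, 0] - 3 * X[:, 1] ** 2 + rng.normal(0, 0.5, 200)     # e.g. a grindability index

X_blocks = rng.uniform(0.0, 1.0, size=(50, 4))                        # blocks to be populated

n_boot = 100
preds = np.empty((n_boot, X_blocks.shape[0]))
for b in range(n_boot):
    idx = rng.integers(0, X.shape[0], X.shape[0])                     # bootstrap resample
    model = GradientBoostingRegressor(random_state=b).fit(X[idx], y[idx])
    preds[b] = model.predict(X_blocks)

# The mean populates the geometallurgical block model; the spread quantifies model uncertainty.
print(f"block 0: mean {preds[:, 0].mean():.2f}, std {preds[:, 0].std():.2f}")
```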

    Evidence-based robust optimization of pulsed laser orbital debris removal under epistemic uncertainty

    An evidence-based robust optimization method for pulsed laser orbital debris removal (LODR) is presented. Epistemic uncertainties, due to limited knowledge, are considered. The objective of the design optimization is to minimize the debris lifetime while at the same time maximizing the corresponding belief value. The Dempster–Shafer theory of evidence (DST), which merges interval-based and probabilistic uncertainty modeling, is used to model and compute the uncertainty impacts. A Kriging-based surrogate is used to reduce the cost incurred by the expensive numerical life prediction model. The effectiveness of the proposed method is illustrated on a set of benchmark problems. Based on the method, a numerical simulation of the removal of Iridium 33 with pulsed lasers is presented, and the most robust solutions with minimum lifetime under uncertainty are identified.
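
    An illustrative sketch of the evidence-theory evaluation at the heart of such a method: given interval focal elements with basic probability assignments (BPAs) on an uncertain parameter, the belief (and plausibility) that a response meets a requirement is accumulated over the focal elements. The lifetime model and numbers below are toy assumptions, not the LODR life prediction model.

```python
# Hedged sketch: Dempster-Shafer belief/plausibility that a toy "lifetime" meets a requirement.

def lifetime(design, u):
    """Toy lifetime (years) as a function of a design variable and an uncertain parameter."""
    return 5.0 / design + 2.0 * u

# Focal elements on the uncertain parameter u: (interval, BPA); BPAs sum to 1.
focal_elements = [((0.0, 0.3), 0.5), ((0.2, 0.6), 0.3), ((0.5, 1.0), 0.2)]
requirement = 4.0     # required maximum lifetime (illustrative)
design = 2.0          # candidate design variable (illustrative)

belief, plausibility = 0.0, 0.0
for (lo, hi), bpa in focal_elements:
    # The toy model is monotone in u, so it suffices to check the interval endpoints.
    worst = max(lifetime(design, lo), lifetime(design, hi))
    best = min(lifetime(design, lo), lifetime(design, hi))
    if worst <= requirement:   # the whole focal element satisfies the requirement
        belief += bpa
    if best <= requirement:    # some point of the focal element satisfies it
        plausibility += bpa

print(f"Bel(lifetime <= {requirement}) = {belief:.2f}, Pl = {plausibility:.2f}")
# The robust optimization then searches for designs that minimize lifetime
# while maximizing this belief value.
```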

    Hazard-compatible modification of stochastic ground motion models

    A computationally efficient framework is presented for the modification of stochastic ground motion models to establish compatibility with the seismic hazard for specific seismicity scenarios and a given structure/site. The modification pertains to the probabilistic predictive models that relate the parameters of the ground motion model to seismicity/site characteristics. These predictive models are defined through a mean prediction and an associated variance, and both these properties are modified in the proposed framework. For a given seismicity scenario, defined for example by the moment magnitude and source-to-site distance, the conditional hazard is described through the mean and the dispersion of some structure-specific intensity measure(s). Therefore, for both the predictive models and the seismic hazard, a probabilistic description is considered, extending previous work of the authors which had examined the description only through mean-value characteristics. The proposed modification is defined as a bi-objective optimization. The first objective corresponds to the comparison, for a chosen seismicity scenario, between the target hazard and the predictions established through the stochastic ground motion model. The second objective corresponds to the comparison of the modified predictive relationships to the pre-existing ones that were developed considering regional data, and guarantees that the resultant ground motions will have features compatible with observed trends. The relative entropy is adopted to quantify both objectives, and a computational framework relying on kriging surrogate modeling is established for an efficient optimization. Computational discussions focus on the estimation of the various statistics of the stochastic ground motion model output needed for the entropy calculation.
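
    A brief sketch of the quantity used for both objectives: the relative entropy (Kullback–Leibler divergence) between two univariate Gaussian descriptions, e.g. a modified predictive model versus the pre-existing regional one, or the predicted versus the target conditional hazard. The closed form below is the standard Gaussian KL divergence; the numbers are purely illustrative.

```python
# Hedged sketch: relative entropy between two univariate Gaussian descriptions.
import numpy as np

def kl_gaussian(mu_p, sd_p, mu_q, sd_q):
    """KL( N(mu_p, sd_p^2) || N(mu_q, sd_q^2) ), closed form for univariate Gaussians."""
    return np.log(sd_q / sd_p) + (sd_p ** 2 + (mu_p - mu_q) ** 2) / (2 * sd_q ** 2) - 0.5

# Modified predictive model (mean and dispersion) vs. the pre-existing regional one.
kl_model = kl_gaussian(mu_p=0.95, sd_p=0.28, mu_q=1.00, sd_q=0.30)
# Predicted hazard description vs. the target conditional hazard (log-IM mean and dispersion).
kl_hazard = kl_gaussian(mu_p=-1.15, sd_p=0.42, mu_q=-1.05, sd_q=0.40)

print(f"model-deviation objective: {kl_model:.4f}, hazard-match objective: {kl_hazard:.4f}")
```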
