
    Some considerations regarding the use of multi-fidelity Kriging in the construction of surrogate models

    No full text
    Surrogate models, or metamodels, are commonly used to exploit expensive computational simulations within a design optimization framework. The application of multi-fidelity surrogate modelling approaches has recently been gaining ground due to the potential for further reductions in simulation effort over single-fidelity approaches. However, given a black-box problem, when exactly should a designer select a multi-fidelity approach over a single-fidelity approach, and vice versa? Using a series of analytical test functions and engineering design examples from the literature, the following paper illustrates the potential pitfalls of choosing one technique over the other without careful consideration of the optimization problem at hand. These examples are then used to define and validate a set of guidelines for the creation of a multi-fidelity Kriging model. The resulting guidelines state that the different fidelity functions should be well correlated, that the amount of low-fidelity data in the model should be greater than the amount of high-fidelity data, and that more than 10% and less than 80% of the total simulation budget should be spent on low-fidelity simulations in order for the resulting multi-fidelity model to perform better than an equivalently costing high-fidelity model.
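The three guidelines from the abstract above can be expressed as a quick feasibility check. This is an illustrative sketch only: the sample counts and cost ratio are made-up numbers, and the 0.9 correlation cutoff for "well correlated" is an assumption, not a threshold stated by the paper.

```python
def check_mf_guidelines(n_hi, n_lo, cost_ratio, corr):
    """Check the three multi-fidelity Kriging guidelines for a candidate setup.

    n_hi, n_lo -- number of high-/low-fidelity samples (illustrative counts)
    cost_ratio -- cost of one high-fidelity run in units of one low-fidelity run
    corr       -- estimated correlation between the two fidelity levels
    """
    budget = n_hi * cost_ratio + n_lo   # total cost in low-fidelity units
    lo_fraction = n_lo / budget         # share of the budget spent on cheap runs
    return {
        "well_correlated": corr >= 0.9,           # 0.9 cutoff is an assumption
        "more_low_than_high": n_lo > n_hi,        # guideline 2
        "budget_split_ok": 0.10 < lo_fraction < 0.80,  # guideline 3
    }

# Hypothetical design: 10 expensive runs (each worth 20 cheap runs) + 50 cheap runs.
print(check_mf_guidelines(n_hi=10, n_lo=50, cost_ratio=20, corr=0.95))
```

With these numbers the low-fidelity share of the budget is 50/250 = 20%, so all three guidelines are satisfied.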


    Engineering design applications of surrogate-assisted optimization techniques

    No full text
    The construction of models aimed at learning the behaviour of a system whose responses to inputs are expensive to measure is a branch of statistical science that has been around for a very long time. Geostatistics has pioneered a drive over the last half century towards a better understanding of the accuracy of such ‘surrogate’ models of the expensive function. Of particular interest to us here are some of the even more recent advances related to exploiting such formulations in an optimization context. While the classic goal of the modelling process has been to achieve a uniform prediction accuracy across the domain, an economical optimization process may aim to bias the distribution of the learning budget towards promising basins of attraction. This can only happen, of course, at the expense of the global exploration of the space, and thus finding the best balance may be viewed as an optimization problem in itself. We examine here a selection of the state-of-the-art solutions to this type of balancing exercise through the prism of several simple, illustrative problems, followed by two ‘real world’ applications: the design of a regional airliner wing and the multi-objective search for a low environmental impact house.
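The exploration/exploitation balance described above is classically handled by an acquisition function such as expected improvement, which rewards both a promising surrogate mean and high surrogate uncertainty. A minimal sketch for minimisation, using only the standard library (the inputs are hypothetical surrogate predictions, not from the paper):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement at a point where the surrogate predicts mean mu
    with uncertainty sigma, given the best objective value found so far
    (minimisation). Trades off exploitation (low mu) against exploration
    (high sigma)."""
    if sigma <= 0.0:
        return 0.0                      # no uncertainty: no expected gain
    z = (f_best - mu) / sigma
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_best - mu) * Phi + sigma * phi

# A slightly-better-but-certain point vs. a worse-but-uncertain point:
print(expected_improvement(mu=0.9, sigma=0.05, f_best=1.0))
print(expected_improvement(mu=1.2, sigma=0.50, f_best=1.0))
```

Both candidates score a positive expected improvement: the second only because of its large uncertainty, which is exactly the exploration bias the abstract describes.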


    Improving the optimisation performance of an ensemble of radial basis functions

    No full text
    In this paper we investigate surrogate-based optimisation performance using two different ensemble approaches and a novel update strategy based on the local Pearson correlation coefficient. The first ensemble is based on a selective approach, where ns RBFs are constructed and the most accurate RBF is selected for prediction at each iteration, while the others are ignored. The second ensemble uses a combined approach, which takes advantage of ns different RBFs, in the hope of reducing errors in the prediction through a weighted combination of the RBFs used. The update strategy uses the local Pearson correlation coefficient as a constraint to ignore domain areas where there is disagreement between the surrogates. In total, the performance of six different approaches is investigated, using five analytical test functions with 2 to 50 dimensions and one engineering problem related to the frequency response of a satellite boom with 2 to 40 dimensions.
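The combined-ensemble idea and the correlation-based gate can be sketched as follows. The weights, the 0.8 agreement threshold, and the use of pairwise correlation between local predictions are all illustrative assumptions; the paper's actual weighting and constraint details are not reproduced here.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ensemble_predict(preds, weights, threshold=0.8):
    """Combined-ensemble prediction: a weighted sum of the surrogates'
    predictions, returned only where the surrogates agree locally (agreement
    measured here by pairwise Pearson correlation -- an assumption)."""
    agree = all(pearson(preds[i], preds[j]) >= threshold
                for i in range(len(preds)) for j in range(i + 1, len(preds)))
    if not agree:
        return None   # disagreement: the update strategy skips this region
    return [sum(w * p[k] for w, p in zip(weights, preds))
            for k in range(len(preds[0]))]

# Two hypothetical RBF surrogates predicting at three nearby points:
print(ensemble_predict([[1.0, 2.0, 3.0], [1.2, 2.1, 3.1]], [0.5, 0.5]))
```

When the surrogates' local predictions disagree (low correlation), the function returns `None`, mimicking the strategy of ignoring those domain areas during updates.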

    Ocular Microtremor Laser Speckle Metrology

    Get PDF
    Ocular Microtremor (OMT) is a continual, high-frequency physiological tremor of the eye present in all subjects even when the eye is apparently at rest. OMT causes a peak-to-peak displacement of around 150 nm to 2500 nm with a broadband frequency spectrum between 30 Hz and 120 Hz, with a peak at about 83 Hz. OMT carries useful clinical information on depth of consciousness and on some neurological disorders. Nearly all quantitative clinical investigations have been based on OMT measurements using an eye-contacting piezoelectric probe, which has low clinical acceptability. Laser speckle metrology is a candidate for a high-resolution, non-contacting, compact, portable OMT measurement technique. However, tear flow and biospeckle might be expected to interfere with the displacement information carried by the speckle. The paper investigates the properties of the speckle of laser light (λ = 632.8 nm) scattered from the eye sclera to assess the feasibility of using speckle techniques, such as speckle correlation, to measure OMT. The investigation is carried out using a high-speed CMOS video camera adequate to capture the high frequency of the tremor. The investigation is supported by studies using an eye movement simulator (a bovine sclera driven by piezoelectric bimorphs). The speckle contrast and the frame-to-frame spatiotemporal variations are analyzed to determine if the OMT characteristics are detectable within speckle changes induced by the biospeckle or other movements.
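The frame-to-frame speckle correlation idea can be illustrated in one dimension: slide one intensity profile over the other and take the lag that maximises their zero-mean cross-correlation as the displacement estimate. The intensity values below are made up, and a real system works on 2-D camera frames; this is only a sketch of the principle.

```python
def frame_shift(frame_a, frame_b, max_shift=5):
    """Estimate the lateral shift between two speckle intensity profiles by
    maximising the zero-mean cross-correlation over candidate lags (1-D
    sketch of speckle-correlation displacement tracking). A negative result
    means the pattern moved towards lower sample indices."""
    n = len(frame_a)
    ma = sum(frame_a) / n
    mb = sum(frame_b) / n
    best_shift, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum((frame_a[i] - ma) * (frame_b[i + s] - mb)
                    for i in range(n) if 0 <= i + s < n)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# A hypothetical speckle profile and the same pattern displaced by 2 samples:
speckle = [0, 1, 5, 9, 4, 1, 0, 2, 7, 3, 1, 0]
shifted = speckle[2:] + [0, 0]
print(frame_shift(speckle, shifted))
```

In a metrology setting, the per-frame shift in samples would then be scaled by the optical magnification to recover the nanometre-range OMT displacement; decorrelation from biospeckle and tear flow would appear as a drop in the peak correlation score.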

    An Open Source Software Culture in the Undergraduate Computer Science Curriculum

    Get PDF
    Open source software has made inroads into mainstream computing where it was once the territory of software altruists, and the open source culture of technological collegiality and accountability may benefit education as well as industry. This paper describes the Recourse project, which seeks to transform the computer science undergraduate curriculum through teaching methods based on open source principles, values, ethics, and tools. Recourse differs from similar projects by bringing the open source culture into the curriculum comprehensively, systematically, and institutionally. The current state of the project is described, and initial results from a pilot exercise are presented.

    Building a traceable climate model hierarchy with multi-level emulators

    Get PDF
    To study climate change on multi-millennial timescales or to explore a model’s parameter space, efficient models with simplified and parameterised processes are required. However, the reduction in explicitly modelled processes can lead to underestimation of some atmospheric responses that are essential to the understanding of the climate system. While more complex general circulation models are available and capable of simulating a more realistic climate, they are too computationally intensive for these purposes. In this work, we propose a multi-level Gaussian emulation technique to efficiently estimate the outputs of steady-state simulations of an expensive atmospheric model in response to changes in boundary forcing. The link between a computationally expensive atmospheric model, PLASIM (Planet Simulator), and a cheaper model, EMBM (energy–moisture balance model), is established through the common boundary condition specified by an ocean model, allowing for information to be propagated from one to the other. This technique allows PLASIM emulators to be built at a low cost. The method is first demonstrated by emulating a scalar summary quantity, the global mean surface air temperature. It is then employed to emulate the dimensionally reduced 2-D surface air temperature field. Even though the two atmospheric models chosen are structurally unrelated, Gaussian process emulators of PLASIM atmospheric variables are successfully constructed using EMBM as a fast approximation. With the extra information gained from the cheap model, the multi-level emulator of PLASIM’s 2-D surface air temperature field is built using only one-third of the amount of expensive data required by the normal single-level technique. The constructed emulator is shown to capture 93.2% of the variance across the validation ensemble, with an average RMSE of 1.33 °C. Using the method proposed, quantities from PLASIM can be constructed and used to study the effects introduced by PLASIM’s atmosphere.
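The core of a multi-level emulator is a link of the form expensive ≈ ρ · cheap + δ, fitted from a small set of paired runs and then applied wherever the cheap model is available. The sketch below replaces the full Gaussian process machinery with a toy ordinary least-squares fit, and all the run values are made-up numbers, not PLASIM or EMBM output.

```python
def fit_multilevel(cheap, expensive):
    """Fit the autoregressive link  expensive ~ rho * cheap + delta  from a
    few paired runs of a cheap and an expensive model (least-squares sketch,
    standing in for the Gaussian process emulators used in the paper)."""
    n = len(cheap)
    mc = sum(cheap) / n
    me = sum(expensive) / n
    rho = (sum((c - mc) * (e - me) for c, e in zip(cheap, expensive))
           / sum((c - mc) ** 2 for c in cheap))
    delta = me - rho * mc
    return rho, delta

# Four hypothetical paired runs (cheap-model vs expensive-model output):
cheap_runs = [10.0, 12.0, 14.0, 16.0]
expensive_runs = [13.1, 15.0, 17.2, 19.1]
rho, delta = fit_multilevel(cheap_runs, expensive_runs)

# Predict the expensive output at a new point where only the cheap run exists:
print(rho * 13.0 + delta)
```

Because each additional training point only needs a cheap run plus the fitted link, the expensive-model budget can shrink substantially, which is the mechanism behind the one-third data saving reported in the abstract.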