
    Gaussian process for ground-motion prediction and emulation of systems of computer models

    In this thesis, several challenges in both ground-motion modelling and surrogate modelling are addressed by developing methods based on Gaussian processes (GPs). The first chapter contains an overview of GPs and summarises the key findings of the rest of the thesis. In the second chapter, an estimation algorithm, called the Scoring estimation approach, is developed to train GP-based ground-motion models with spatial correlation. The Scoring estimation approach is introduced theoretically and numerically, and it is proven to have desirable convergence and computational properties. It is a statistically robust method, producing consistent and statistically efficient estimators of the spatial correlation parameters. The predictive performance of the estimated ground-motion model is assessed in a simulation-based application, which has important implications for seismic risk assessment. In the third chapter, a GP-based surrogate model, called the integrated emulator, is introduced to emulate a system of multiple computer models. It generalises the state-of-the-art linked emulator for a system of two computer models and considers a variety of kernels (exponential, squared exponential, and two key Matérn kernels) that are essential in advanced applications. By learning the system structure, the integrated emulator outperforms the composite emulator, which emulates the entire system using only global inputs and outputs. Furthermore, its analytic expressions allow a fast and efficient design algorithm that can yield significant computational and predictive gains by allocating different numbers of runs to individual computer models according to their heterogeneous functional complexity. The benefits of the integrated emulator are demonstrated in a series of synthetic experiments and in a feedback-coupled fire-detection satellite model. Finally, the method underlying the integrated emulator is used to construct a non-stationary Gaussian process model based on a deep Gaussian hierarchy.
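    The thesis derives closed-form expressions for the linked emulator, but the contrast with a composite emulator can be sketched simply. Below is a minimal, illustrative Python sketch (not the thesis's method): two scikit-learn GPs are chained by plugging the first emulator's predictive mean into the second, versus one composite GP trained directly from global inputs to final outputs. The simulators f1 and f2 and all settings are toy assumptions.

```python
# Minimal sketch: emulating a two-model chain f2(f1(x)) with Gaussian
# processes, versus a composite emulator from global input to final output.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f1(x):  # first computer model (toy stand-in)
    return np.sin(3 * x)

def f2(z):  # second computer model, consuming f1's output
    return z ** 2 + 0.5 * z

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (15, 1))          # global inputs
Z = f1(X)                                # intermediate outputs
Y = f2(Z)                                # final outputs

kernel = Matern(nu=2.5)                  # one of the kernels treated in the thesis
gp1 = GaussianProcessRegressor(kernel=kernel).fit(X, Z.ravel())
gp2 = GaussianProcessRegressor(kernel=kernel).fit(Z, Y.ravel())

# Composite emulator: a single GP using only global inputs and outputs.
gp_comp = GaussianProcessRegressor(kernel=kernel).fit(X, Y.ravel())

x_new = np.linspace(0, 1, 200).reshape(-1, 1)
z_mean = gp1.predict(x_new).reshape(-1, 1)
y_chained = gp2.predict(z_mean)          # crude plug-in chained prediction
y_composite = gp_comp.predict(x_new)
```

    Note that plugging in the predictive mean discards the first emulator's uncertainty; the thesis's linked/integrated emulator propagates it analytically.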

    Multi-Fidelity Gaussian Process Emulation And Its Application In The Study Of Tsunami Risk Modelling

    Investigating uncertainties in computer simulations can be prohibitive in terms of computational cost, since the simulator needs to be run over a large number of input values. Building a statistical surrogate model of the simulator from a small design of experiments greatly alleviates the computational burden of such investigations. Nevertheless, this can still exceed the computational budget of many studies. We present a novel method, the multilevel adaptive sequential design of computer experiments (MLASCE), in the framework of Gaussian process (GP) emulators. MLASCE combines two major approaches: efficient design of experiments, such as sequential designs, and the combination of training data of different degrees of sophistication in a so-called multi-fidelity method, or multilevel method when these fidelities are ordered, typically by increasing resolution. This dual strategy allows us to allocate limited computational resources efficiently across simulations of different levels of fidelity and to build the GP emulator. The allocation of computational resources is shown to be the solution of a simple optimization problem in a special case where we theoretically prove the validity of our approach. MLASCE is compared with other existing models of multi-fidelity Gaussian process emulation, and gains of orders of magnitude in accuracy for medium-size computing budgets are demonstrated in numerical examples. MLASCE should be useful in computer experiments of natural disaster risk, as more than a mere tool for calculating the scale of natural disasters. To show that MLASCE meets this expectation, we propose the first end-to-end example of a risk model for household asset loss due to a possible future tsunami. Within this framework, MLASCE provides a reliable statistical surrogate for realistic tsunami risk assessment under a restricted computational budget, delivering accurate and near-instant predictions of future tsunami risk.
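    As a rough illustration of the multilevel idea (not MLASCE itself, which adds adaptive sequential design and resource allocation), the sketch below combines many cheap runs with a few expensive ones in an AR(1)-style two-fidelity GP. The simulators, design sizes, and scaling estimate are toy assumptions.

```python
# Minimal two-fidelity sketch: GP on the cheap code, plus a GP on the
# discrepancy between the expensive code and a scaled cheap prediction.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def cheap(x):        # low-fidelity model (hypothetical)
    return np.sin(8 * x)

def expensive(x):    # high-fidelity model (hypothetical)
    return (x - 0.3) * np.sin(8 * x) + 0.1

rng = np.random.default_rng(1)
X_lo = rng.uniform(0, 1, (40, 1))       # many cheap runs
X_hi = rng.uniform(0, 1, (8, 1))        # few expensive runs

gp_lo = GaussianProcessRegressor(kernel=RBF()).fit(X_lo, cheap(X_lo).ravel())

# Scaling factor rho estimated by least squares at the high-fidelity design.
lo_at_hi = gp_lo.predict(X_hi)
y_hi = expensive(X_hi).ravel()
rho = np.dot(lo_at_hi, y_hi) / np.dot(lo_at_hi, lo_at_hi)

gp_delta = GaussianProcessRegressor(kernel=RBF()).fit(X_hi, y_hi - rho * lo_at_hi)

x_new = np.linspace(0, 1, 200).reshape(-1, 1)
y_pred = rho * gp_lo.predict(x_new) + gp_delta.predict(x_new)
```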

    Accelerating inference in cosmology and seismology with generative models

    Statistical analyses in many physical sciences require running simulations of the system being examined. Such simulations provide information complementary to theoretical analytic models and represent an invaluable tool for investigating the dynamics of complex systems. However, running simulations is often computationally expensive, and the large number of mock realisations required to reach sufficient statistical precision often makes the problem intractable. In recent years, machine learning has emerged as a possible solution for speeding up the generation of scientific simulations. Machine learning generative models typically rely on iteratively feeding true simulations to the algorithm until it learns the important common features and is capable of producing accurate simulations in a fraction of the time. In this thesis, advanced machine learning algorithms are explored and applied to the challenge of accelerating physical simulations. Various techniques are applied to problems in cosmology and seismology, and the benefits and limitations of such an approach are examined through a critical analysis. The algorithms are applied to compelling problems in both fields, including surrogate models for the seismic wave equation, the emulation of cosmological summary statistics, and the fast generation of large simulations of the Universe. These problems are formulated within a relevant statistical framework and tied to real data analysis pipelines. In the conclusions, a critical overview of the results is provided, together with an outlook on possible future extensions of the work presented in the thesis.
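    The basic emulation loop described above can be sketched in a few lines: run a batch of simulations, fit a learned model, then replace further runs with fast predictions. The stand-in below uses a small feed-forward network, whereas the thesis employs considerably richer generative architectures; the simulator and parameter ranges are hypothetical.

```python
# Minimal sketch of the simulate -> learn -> predict-cheaply loop.
import numpy as np
from sklearn.neural_network import MLPRegressor

def simulate(theta):
    """Hypothetical expensive simulator returning a summary statistic."""
    return np.sin(5 * theta[:, 0]) * np.exp(-theta[:, 1])

rng = np.random.default_rng(2)
Theta = rng.uniform(0, 1, (500, 2))     # sampled input parameters
S = simulate(Theta)                     # "true" simulations fed to the model

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(Theta, S)

# Cheap emulated statistics for new parameter values.
S_fast = net.predict(rng.uniform(0, 1, (10, 2)))
```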

    Improvement of code behaviour in a design of experiments by metamodeling

    It is now common practice in nuclear engineering to base extensive studies on numerical computer models. These studies require running computer codes in potentially thousands of numerical configurations, without individual expert control of the computational and physical aspects of each simulation. In this paper, we compare different statistical metamodeling techniques and show how metamodels can help to improve the global behaviour of codes in these extensive studies. We consider the metamodeling of the Germinal thermal-mechanical code by Kriging, kernel regression, and neural networks. Kriging provides the most accurate predictions, while neural networks yield the fastest metamodel functions. All three metamodels can conveniently detect strong computation failures. It is, however, significantly more challenging to detect code instabilities, that is, groups of computations that are all valid but numerically inconsistent with one another. For code instability detection, we find that Kriging provides the most useful tools.
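    The failure-detection idea can be illustrated with a minimal Kriging sketch: fit a GP metamodel to the design of experiments and flag runs that fall far outside its predictive band. The toy code and injected failures below are illustrative assumptions, not the Germinal code.

```python
# Minimal sketch: flagging suspect runs with a Kriging metamodel.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (60, 1))
y = np.sin(6 * X).ravel()
y[::15] += 5.0                           # injected "strong computation failures"

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5) + WhiteKernel()).fit(X, y)
mean, std = gp.predict(X, return_std=True)

# Flag runs whose outputs fall far outside the metamodel's predictive band.
suspect = np.abs(y - mean) > 3 * std
print("flagged runs:", np.flatnonzero(suspect))
```

    Detecting the subtler instabilities discussed above would require comparing groups of mutually inconsistent yet individually plausible runs, which this simple residual test does not capture.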

    Modelling the interaction between induced pluripotent stem cell-derived cardiomyocyte patches and the recipient hearts

    Cardiovascular diseases are the main cause of death worldwide, and the single biggest killer is ischemic heart disease. Myocardial infarction causes the formation of non-conductive, non-contractile, scar-like tissue in the heart, which can hamper the heart's physiological function and cause pathologies ranging from arrhythmias to heart failure. Because of the myocardium's limited ability to regenerate, the heart cannot recover the tissue lost to myocardial infarction. The only available treatment is heart transplant, which is limited by the number of donors and can elicit an adverse response from the recipient's immune system. Recently, regenerative medicine has been proposed as an alternative approach to help post-myocardial infarction hearts recover their functionality. Among the various techniques, the application of cardiac patches of engineered heart tissue in combination with electroactive materials constitutes a promising technology. However, many challenges need to be faced in the development of this treatment. One of the main concerns is the immature phenotype of the stem cell-derived cardiomyocytes used to fabricate the engineered heart tissue: their electrophysiological differences with respect to the host myocardium may contribute to an increased arrhythmia risk. A large number of animal experiments are needed to optimize the patches' characteristics and to better understand the implications of the electrical interaction between patches and host myocardium. In this Thesis we leveraged cardiac computational modelling to simulate in silico electrical propagation in scarred heart tissue in the presence of a patch of engineered heart tissue and conductive polymer engrafted at the epicardium. This work is composed of two studies. In the first study we designed a tissue model with simplified geometry and used machine learning and global sensitivity analysis techniques to identify engineered heart tissue patch design variables that are important for restoring physiological electrophysiology in the host myocardium. Additionally, we showed how engineered heart tissue properties could be tuned to restore physiological activation while reducing arrhythmic risk. In the second study we moved to more realistic geometries and devised a way to manipulate ventricle meshes obtained from magnetic resonance images in order to apply in silico engineered heart tissue epicardial patches. We then investigated how patches with different conduction velocities and action potential durations influence the host ventricle's electrophysiology. Specifically, we showed that appropriately located patches can reduce the predisposition to anatomical isthmus-mediated re-entry, and that patches with a physiological action potential duration and higher conduction velocity were most effective in reducing this risk. We also demonstrated that patches with the conduction velocity and action potential duration typical of immature stem cell-derived cardiomyocytes were associated with the onset of sustained functional re-entry in an ischemic cardiomyopathy model with a large transmural scar. Finally, we demonstrated that patches electrically coupled to the host myocardium reduce the likelihood of propagation of focal ectopic impulses.
    This Thesis demonstrates how computational modelling can be successfully applied to the field of regenerative medicine and constitutes the first step towards the creation of patient-specific models for developing and testing patches for cardiac regeneration.
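    The global sensitivity analysis step in the first study can be sketched with Sobol indices. The snippet below uses SALib on a toy surrogate, with hypothetical patch design variables (conduction velocity, action potential duration, coupling strength) standing in for the Thesis's electrophysiology simulations.

```python
# Minimal sketch: ranking patch design variables by first-order Sobol indices.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["conduction_velocity", "apd", "patch_coupling"],  # hypothetical
    "bounds": [[0.1, 1.0], [200.0, 400.0], [0.0, 1.0]],
}

def surrogate(p):
    """Toy response standing in for a simulated activation metric."""
    return np.sin(p[:, 0]) + 0.01 * p[:, 1] + 0.5 * p[:, 0] * p[:, 2]

params = saltelli.sample(problem, 1024)   # Saltelli design over the bounds
Y = surrogate(params)
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"])))  # first-order indices
```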

    Uncertainty quantification and management in multidisciplinary design optimisation.

    We analyse the uncertainty present at the structural-sizing stage of aircraft design due to interactions between aeroelastic loading and incomplete structural definition. In particular, we look at critical load case identification: the process of identifying, from sparse and expensive-to-obtain data, the flight conditions at which the maximum loading conditions occur. To address this challenge, we investigate the construction of robust emulators: probabilistic models of computer code outputs which explicitly and reliably model their predictive uncertainty. Using Gaussian process regression, we show how such models can be derived from simple and intuitive considerations about the interactions between parameter inference and data, and, via state-of-the-art statistical software, we develop a generally applicable and easy-to-use method for constructing them. The effectiveness of these models is demonstrated on a range of synthetic and engineering test functions. We then use them to approach two facets of critical load case identification: sample-efficient search for the critical cases via Bayesian optimisation, and probabilistic assessment of possible locations of the critical cases from a given sample. The latter facilitates quantitative down-selection of candidate load cases by ruling out regions of the search space with a low probability of containing the critical cases, potentially saving a designer many hours of simulation time. Finally, we show how the presence of design variability in the loads analysis implies a stochastic process, and we attempt to construct a model for this by parametrisation of its marginal distributions.
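    The sample-efficient search mentioned above can be sketched as standard Bayesian optimisation with expected improvement. The 1-D load function below is a hypothetical stand-in for the aeroelastic loads analysis, and the loop is far simpler than the robust emulators developed in the thesis.

```python
# Minimal sketch: searching for a critical (maximum-load) flight condition
# by Bayesian optimisation with the expected-improvement acquisition.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def load(x):  # hypothetical expensive loads analysis
    return np.sin(4 * x).ravel() + x.ravel()

rng = np.random.default_rng(4)
X = rng.uniform(0, 2, (5, 1))            # initial sparse design
y = load(X)

grid = np.linspace(0, 2, 400).reshape(-1, 1)
for _ in range(15):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    imp = mu - y.max()                   # improvement over best observed load
    z = imp / np.maximum(sd, 1e-12)
    ei = imp * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, load(x_next))

print("estimated critical condition:", X[np.argmax(y)])
```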

    Recent developments in Quantum Monte-Carlo simulations with applications for cold gases

    This is a review of recent developments in Monte Carlo methods in the field of ultracold gases. For bosonic atoms in an optical lattice we discuss path integral Monte Carlo simulations with worm updates and show the excellent agreement with cold atom experiments. We also review recent progress in simulating bosonic systems with long-range interactions, disordered bosons, mixtures of bosons, and spinful bosonic systems. For repulsive fermionic systems, determinantal methods are sign-free at half filling, but in general no sign-free method exists. We review developments in diagrammatic Monte Carlo for the Fermi polaron problem and the Hubbard model, and show the connection with dynamical mean-field theory. We end the review with diffusion Monte Carlo for the Stoner problem in cold gases.
    Comment: 68 pages, 22 figures, review article; replaced with published version
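    As a flavour of the machinery the review covers, here is a minimal path-integral Monte Carlo sketch for a single particle in a 1-D harmonic trap (units m = ħ = ω = 1) with naive single-slice Metropolis updates; production cold-gas simulations rely on far more sophisticated moves such as the worm algorithm.

```python
# Minimal path-integral Monte Carlo for a 1-D harmonic oscillator.
import numpy as np

rng = np.random.default_rng(5)
beta, M = 4.0, 64                 # inverse temperature, imaginary-time slices
dt = beta / M
x = np.zeros(M)                   # closed worldline: x[M] wraps to x[0]

def local_action(xl, xc, xr):
    """Kinetic links plus potential term involving slice xc (V = x^2 / 2)."""
    return ((xc - xl) ** 2 + (xr - xc) ** 2) / (2 * dt) + dt * 0.5 * xc ** 2

samples = []
for sweep in range(5000):
    for j in range(M):
        l, r = x[(j - 1) % M], x[(j + 1) % M]
        x_new = x[j] + rng.normal(0, 0.5)
        dS = local_action(l, x_new, r) - local_action(l, x[j], r)
        if dS < 0 or rng.random() < np.exp(-dS):   # Metropolis acceptance
            x[j] = x_new
    if sweep > 500:                                # discard burn-in
        samples.append(np.mean(x ** 2))

# Exact result for comparison: <x^2> = (1/2) coth(beta/2).
print("<x^2> =", np.mean(samples), "exact:", 0.5 / np.tanh(beta / 2))
```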