
    Methods of likelihood based inference for constructing stochastic climate models

    This thesis is about the construction of low-dimensional diffusion models of climate variables. It assesses the predictive skill of models derived from a principled averaging procedure and from a purely empirical approach. The averaging procedure starts from the equations of the original system, approximates the "weather" variables by a stochastic process, and then averages them with respect to their invariant measure; this assumes that they equilibrate much faster than the climate variables. The empirical approach argues for a very general model form, whose parameters are then estimated using likelihood-based inference for stochastic differential equations. This is computationally demanding and relies upon Markov chain Monte Carlo methods, so a large part of this thesis is focused on techniques to improve the efficiency of these algorithms. The empirical approach works well on simple one-dimensional models but performs poorly on multivariate problems due to the rapid increase in the number of unknown parameters. The averaging procedure is skillful in multivariate problems but is sensitive to a lack of complete time-scale separation in the system. In conclusion, the averaging procedure is better, and it can be improved by estimating parameters in a principled way based on the likelihood function and by including a latent noise process in the model.
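The empirical route described in this abstract can be illustrated in miniature. The sketch below uses an Ornstein-Uhlenbeck process as a stand-in for a one-dimensional climate diffusion, builds the Euler-Maruyama pseudo-likelihood, and estimates the drift and diffusion parameters with a random-walk Metropolis sampler. All parameter values are assumptions for the example; this is a minimal illustration of likelihood-based inference for SDEs, not the thesis's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck path dX = -theta*X dt + sigma dW,
# a stand-in for a one-dimensional climate diffusion (assumed parameters).
theta_true, sigma_true, dt, n = 1.5, 0.5, 0.01, 5000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = (x[i-1] - theta_true * x[i-1] * dt
            + sigma_true * np.sqrt(dt) * rng.standard_normal())

def log_lik(theta, sigma):
    """Euler-Maruyama pseudo-likelihood: Gaussian one-step transitions."""
    if theta <= 0.0 or sigma <= 0.0:
        return -np.inf
    resid = x[1:] - (x[:-1] - theta * x[:-1] * dt)
    var = sigma**2 * dt
    return -0.5 * np.sum(resid**2 / var + np.log(2.0 * np.pi * var))

# Random-walk Metropolis over (theta, sigma) with flat priors on (0, inf).
step = np.array([0.1, 0.01])      # per-parameter proposal scales
cur = np.array([1.0, 1.0])
cur_ll = log_lik(*cur)
samples = []
for _ in range(20000):
    prop = cur + step * rng.standard_normal(2)
    prop_ll = log_lik(*prop)
    if np.log(rng.uniform()) < prop_ll - cur_ll:   # Metropolis accept step
        cur, cur_ll = prop, prop_ll
    samples.append(cur.copy())
theta_hat, sigma_hat = np.array(samples[5000:]).mean(axis=0)
```

The diffusion parameter is recovered sharply from the quadratic variation of the path, while the drift parameter has a much wider posterior; the rapid growth of such parameter vectors in multivariate models is exactly the scaling problem the abstract describes.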

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    This paper explores the technical efficiency of four hotels of the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units, all located in Portugal, is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are made for each hotel studied.
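As a rough illustration of how Stochastic Frontier Analysis separates measurement noise from systematic inefficiency, the sketch below fits the classic normal/half-normal frontier model by maximum likelihood on simulated data. The single-input log-linear frontier and all parameter values are assumptions for the example, not figures from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Simulated frontier: log y = b0 + b1*log x + v - u, where
# v ~ N(0, sv^2) is two-sided noise and u ~ |N(0, su^2)| is inefficiency.
n, b0, b1, sv, su = 500, 1.0, 0.6, 0.2, 0.4
logx = rng.uniform(0.0, 3.0, n)
v = sv * rng.standard_normal(n)
u = np.abs(su * rng.standard_normal(n))
logy = b0 + b1 * logx + v - u

def neg_ll(p):
    """Normal/half-normal SFA log-likelihood (log-parametrized scales)."""
    b0_, b1_, lsv, lsu = p
    sv_, su_ = np.exp(lsv), np.exp(lsu)
    eps = logy - b0_ - b1_ * logx          # composed error v - u
    sig = np.hypot(sv_, su_)
    lam = su_ / sv_
    ll = (np.log(2.0) - np.log(sig) + norm.logpdf(eps / sig)
          + norm.logcdf(-eps * lam / sig))
    return -ll.sum()

res = minimize(neg_ll, x0=[0.5, 0.5, np.log(0.3), np.log(0.3)],
               method="Nelder-Mead", options={"maxiter": 5000})
b0_hat, b1_hat = res.x[:2]
su_hat = np.exp(res.x[3])                  # estimated inefficiency scale
```

The key point the abstract relies on is visible in `neg_ll`: the composed error is split into a symmetric noise scale and a one-sided inefficiency scale, so inefficiency can be quantified per unit rather than absorbed into the residual.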

    Multiscale Simulation of Polymeric Fluids using Sparse Grids

    The numerical simulation of non-Newtonian fluids is of high practical relevance, since most complex fluids developed in the chemical industry are not correctly modeled by classical fluid mechanics. In this thesis, we implement a multiscale multi-bead-spring chain model in the three-dimensional Navier-Stokes solver NaSt3DGPF developed at the Institute for Numerical Simulation, University of Bonn. It is the first implementation of such a high-dimensional model for non-Newtonian fluids in a three-dimensional flow solver. Using this model, we present novel simulation results for a square-square contraction flow problem. We then compare the results of our 3D simulations with experimental measurements from the literature and obtain very good agreement. Up to now, high-dimensional multiscale approaches have hardly been used in practical applications, as they lead to computing times on the order of months even on massively parallel computers. This thesis combines two approaches to reduce this enormous computational complexity. First, we use a domain decomposition with MPI to allow for massively parallel computations. Second, we employ a dimension-adaptive sparse grid variant, the combination technique, to reduce the computational complexity of the multiscale model. Here, the combination technique is used in a general formulation that balances not only the different discretization errors but also the accuracy of the mathematical model.
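The combination technique mentioned in this abstract can be sketched on a toy problem. Instead of a Navier-Stokes solve, the example below combines anisotropic trapezoidal quadrature grids for a 2D integral using the classical weights u_L = sum over l1+l2=L of u_(l1,l2) minus the sum over l1+l2=L-1; the grid levels and the integrand are assumptions for illustration only.

```python
import numpy as np

def trap_2d(f, l1, l2):
    """Trapezoidal rule on an anisotropic grid with 2**l1 x 2**l2 cells."""
    n1, n2 = 2**l1, 2**l2
    x = np.linspace(0.0, 1.0, n1 + 1)
    y = np.linspace(0.0, 1.0, n2 + 1)
    wx = np.full(n1 + 1, 1.0 / n1); wx[[0, -1]] *= 0.5   # trapezoid weights
    wy = np.full(n2 + 1, 1.0 / n2); wy[[0, -1]] *= 0.5
    X, Y = np.meshgrid(x, y, indexing="ij")
    return wx @ f(X, Y) @ wy

def combination_technique(f, L):
    """Classical combination: diagonal l1+l2 = L minus diagonal l1+l2 = L-1."""
    total = sum(trap_2d(f, l1, L - l1) for l1 in range(1, L))
    total -= sum(trap_2d(f, l1, L - 1 - l1) for l1 in range(1, L - 1))
    return total

f = lambda x, y: np.exp(x + y)
exact = (np.e - 1.0) ** 2          # integral of e^(x+y) over the unit square
approx = combination_technique(f, 10)   # combines thin anisotropic grids
full = trap_2d(f, 10, 10)               # full tensor grid at the same level
```

The combined solution uses only long, thin grids (at most about 2^L points each) yet cancels the leading splitting errors of its components, which is the mechanism that makes the high-dimensional multiscale model in the thesis tractable.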

    A Unified Model for XVA, including Interest Rates and Rating

    We start in Chapter 2 by investigating linear matrix-valued SDEs and the Itô-stochastic Magnus expansion. The Itô-stochastic Magnus expansion provides an efficient numerical scheme for solving matrix-valued SDEs. We show convergence of the expansion up to a stopping time τ and provide an asymptotic estimate of the cumulative distribution function of τ. Moreover, we show how to apply it to solve SPDEs with one and two spatial dimensions, with high accuracy, by combining it with the method of lines. We will see that the Magnus expansion allows us to use GPU techniques, leading to major performance improvements over a standard Euler-Maruyama scheme. In Chapter 3, we study a short-rate model in a Cox-Ingersoll-Ross (CIR) framework for negative interest rates. We define the short rate as the difference of two independent CIR processes and add a deterministic shift to guarantee a perfect fit to the market term structure. We show how to use the Gram-Charlier expansion to efficiently calibrate the model to the market swaption surface and to price Bermudan swaptions with good accuracy. We take two different perspectives on rating transition modelling. In Section 4.4, we study inhomogeneous continuous-time Markov chains (ICTMCs) as a candidate for a rating model with deterministic rating transitions. We extend this model by taking a Lie group perspective in Section 4.5 to allow for stochastic rating transitions. In both cases, we compare the most popular choices of change-of-measure technique and show how to efficiently calibrate both models to the available historical rating data and market default probabilities. Finally, we apply the techniques developed in this thesis to minimize the collateral-inclusive Credit/Debit Valuation Adjustments under the constraint of small collateral postings, by using a collateral account dependent on rating triggers.
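The two-factor CIR construction from Chapter 3 can be sketched by Monte Carlo: the short rate is modelled as the difference of two independent CIR factors plus a deterministic shift, simulated here with a full-truncation Euler scheme. All parameter values and the flat shift are illustrative assumptions; the point is only that the difference construction admits negative rates while each factor remains nonnegative.

```python
import numpy as np

rng = np.random.default_rng(2)

def cir_paths(x0, kappa, theta, sigma, dt, n_steps, n_paths):
    """Full-truncation Euler for dX = kappa*(theta - X) dt + sigma*sqrt(X) dW."""
    x = np.full(n_paths, x0, dtype=float)
    out = np.empty((n_steps + 1, n_paths))
    out[0] = x
    for i in range(1, n_steps + 1):
        xp = np.maximum(x, 0.0)        # truncate only inside drift/diffusion
        x = (x + kappa * (theta - xp) * dt
             + sigma * np.sqrt(xp * dt) * rng.standard_normal(n_paths))
        out[i] = x
    return np.maximum(out, 0.0)        # report nonnegative CIR factors

# Short rate r = x - y + phi: two independent CIR factors plus a
# deterministic shift phi (flat here purely for illustration).
dt, n_steps, n_paths, phi = 1.0 / 252, 252, 20000, 0.002
x = cir_paths(0.020, 0.8, 0.020, 0.08, dt, n_steps, n_paths)
y = cir_paths(0.015, 0.5, 0.015, 0.06, dt, n_steps, n_paths)
r = x - y + phi

# Monte Carlo zero-coupon bond price P(0, 1y) = E[exp(-integral of r dt)].
price = np.exp(-np.sum(r[:-1] * dt, axis=0)).mean()
neg_frac = (r < 0.0).mean()            # fraction of rate observations below zero
```

In a calibrated model the shift would be chosen per maturity to reproduce the market term structure exactly, rather than held constant as in this sketch.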

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, modelling and simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to afford better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.