
    Simulation of asset prices using LĂ©vy processes

    Includes bibliographical references (leaves 93-97). This dissertation focuses on a LĂ©vy-process-driven framework for the pricing of financial instruments. The main focus, however, is not the pricing of these instruments but simulation. Simulation is a key issue in Monte Carlo pricing and risk-neutral valuation: it is the first step towards pricing, and therefore must be done accurately and with care. This dissertation looks at different kinds of LĂ©vy processes and the various approaches one can take when simulating them.
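    One of the simplest LĂ©vy processes to simulate combines a drifted Brownian motion with compound Poisson jumps (a Merton-style jump diffusion). The sketch below, which illustrates the kind of simulation the dissertation discusses rather than its specific algorithms, simulates asset prices S_t = S0Â·exp(X_t); all parameter values and names are illustrative.

```python
import numpy as np

def simulate_jump_diffusion(s0, mu, sigma, lam, jump_mu, jump_sigma,
                            t, n_steps, n_paths, rng):
    """Simulate S_t = s0 * exp(X_t), where X is drifted Brownian motion
    plus compound Poisson jumps -- a simple LĂ©vy process.  Illustrative
    sketch only; parameter names are not from the dissertation."""
    dt = t / n_steps
    # Gaussian diffusion increments (with the usual -sigma^2/2 correction)
    diff = ((mu - 0.5 * sigma**2) * dt
            + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps)))
    # Number of jumps in each step; summed jump sizes are conditionally
    # normal with mean counts*jump_mu and variance counts*jump_sigma^2
    counts = rng.poisson(lam * dt, size=(n_paths, n_steps))
    jumps = rng.normal(counts * jump_mu, jump_sigma * np.sqrt(counts))
    x = np.cumsum(diff + jumps, axis=1)
    # prepend the t=0 column so every path starts at s0
    return s0 * np.exp(np.concatenate([np.zeros((n_paths, 1)), x], axis=1))
```

    With the jump intensity set to zero the scheme collapses to ordinary geometric Brownian motion, which gives a quick sanity check on an implementation.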

    Computational applications in stochastic operations research

    Several computational applications in stochastic operations research are presented, where, for each application, a computational engine is used to achieve results that are otherwise overly tedious by hand calculation, or in some cases mathematically intractable. Algorithms and code are developed and implemented, with specific emphasis placed on achieving exact results, and are substantiated via Monte Carlo simulation. The code for each application is provided in the software language utilized, and the algorithms are available for coding in another environment. The topics include univariate and bivariate nonparametric random variate generation using a piecewise-linear cumulative distribution function, deriving exact statistical process control chart constants for non-normal sampling, testing probability distribution conformance to Benford's law, and transient analysis of M/M/s queueing systems. The nonparametric random variate generation chapters provide the modeler with a method of generating univariate and bivariate samples when only observed data are available. The method is completely nonparametric and is capable of mimicking multimodal joint distributions. The algorithm is black-box: no decisions are required from the modeler in generating variates for simulation. The statistical process control chart constant chapter develops constants for select non-normal distributions and provides tabulated results for researchers who have identified a given process as non-normal. The constants derived are bias correction factors for the sample range and sample standard deviation. The Benford conformance testing chapter offers the Kolmogorov-Smirnov test as an alternative to the standard chi-square goodness-of-fit test when testing whether the leading digits of a data set are distributed according to Benford's law. The alternative test has the advantage of being exact for all sample sizes, removing the usual sample-size restriction involved with the chi-square goodness-of-fit test. The transient queueing analysis chapter develops and automates the construction of the sojourn time distribution for the nth customer in an M/M/s queue with k customers initially present at time 0 (k ≄ 0), without the usual limit on the traffic intensity ρ < 1, providing an avenue to conduct transient analysis on various measures of performance for a given initial number of customers in the system. It also develops and automates the construction of the joint probability distribution function of sojourn times for pairs of customers, allowing calculation of the exact covariance between customer sojourn times.
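    The univariate generator described above amounts to inverse-transform sampling through a piecewise-linear CDF fitted to the observed data. A minimal sketch of that idea (not the authors' exact algorithm):

```python
import numpy as np

def piecewise_linear_variates(data, n, rng):
    """Nonparametric inverse-transform sampling: build a piecewise-linear
    CDF whose knots are the order statistics of the observed data, then
    push uniform draws through its inverse.  Illustrative sketch only."""
    knots = np.sort(np.asarray(data, dtype=float))
    # CDF takes value i/(m-1) at the i-th order statistic, linear between
    probs = np.linspace(0.0, 1.0, knots.size)
    u = rng.uniform(size=n)
    # np.interp evaluates the (piecewise-linear) inverse CDF at u
    return np.interp(u, probs, knots)
```

    Because the inverse CDF interpolates between order statistics, generated variates always fall inside the observed range, and no distributional family is assumed.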

    Computer generation of directional data.

    by Carl Ka-fai Wong. Thesis (M.Phil.)--Chinese University of Hong Kong, 1991. Includes bibliographical references.
    Contents:
    Chapter 1: Introduction (p.1); §1.1 Directional Data and Computer Simulation (p.1); §1.2 Computer Simulation Techniques (p.2); §1.3 Implementation and Preliminaries (p.4)
    Chapter 2: Generating Random Points on the N-sphere (p.6); §2.1 Methods (p.6); §2.2 Comparison of Methods (p.10)
    Chapter 3: Generating Variates from Non-uniform Distributions on the Circle (p.14); §3.1 Introduction (p.14); §3.2 Methods for Circular Distributions (p.15)
    Chapter 4: Generating Variates from Non-uniform Distributions on the Sphere (p.28); §4.1 Introduction (p.28); §4.2 Methods for Spherical Distributions (p.29)
    Chapter 5: Generating Variates from Non-uniform Distributions on the N-sphere (p.56); §5.1 Introduction (p.56); §5.2 Methods for Higher Dimensional Spherical Distributions (p.56)
    Chapter 6: Summary and Discussion (p.69)
    References (p.72); Appendix 1 (p.77); Appendix 2 (p.9)
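    The starting point of Chapter 2, generating uniform random points on the N-sphere, is classically done by normalizing standard Gaussian vectors: the multivariate standard normal is spherically symmetric, so its projection onto the unit sphere is exactly uniform. A sketch of that method (one of several the thesis compares):

```python
import numpy as np

def uniform_on_sphere(n_points, dim, rng):
    """Uniform random points on the unit (dim-1)-sphere in R^dim via the
    normalized-Gaussian method: spherical symmetry of the standard normal
    makes the projected directions exactly uniform."""
    x = rng.standard_normal((n_points, dim))
    return x / np.linalg.norm(x, axis=1, keepdims=True)
```

    Unlike rejection-based schemes, this method needs no retries and works in any dimension, which is why it is the usual benchmark for comparison.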

    Copula-Based Multivariate Hydrologic Frequency Analysis

    Multivariate frequency distributions are being increasingly recognized for their role in hydrological design and risk management. Conventional multivariate distributions are severely limited in that all constituent marginals have to be from the same distribution family. The copula method is a newly emerging approach for deriving multivariate distributions which overcomes this limitation. Use of the copula method in hydrological applications has begun only recently, and ascertaining the applicability of different copulas for combinations of various hydrological variables is currently an area of active research. Since there exists a variety of copulas capable of characterizing a broad range of dependence, the selection of appropriate copulas for different hydrological applications becomes a non-trivial task. This study evaluates the relative performance of various copulas and methods of parameter estimation, as well as of recently developed statistical inference procedures. Potential copulas for multivariate extreme flow and rainfall processes are then identified. Multivariate hydrological frequency analysis typically utilizes only the concurrent parts of observed data, leaving a lot of non-concurrent information unutilized. Uncertainty in distribution parameter estimates can be reduced by simultaneously including such non-concurrent data in the analysis. A new copula-based “Composite Likelihood Approach” that allows all available multivariate data of varying lengths to be combined and analyzed in an integrated manner has been developed. This approach yields additional information, enhancing the precision of parameter estimates that are otherwise obtained from either purely univariate or purely multivariate considerations. The approach can be advantageously employed in limited hydrological data situations, providing significant virtual augmentation of available data lengths by virtue of the increased precision of parameter estimates.
    The effectiveness of a copula selection framework that helps in an a priori shortlisting of potentially viable copulas on the basis of dependence characteristics has been examined using several case studies pertaining to various extreme flow and rainfall variables. The benefits of the composite likelihood approach, in terms of significant improvement in the precision of parameter estimates of distributions commonly used in hydrology, such as the normal, Gumbel, gamma, and log-Pearson Type III, have been quantified.
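    The key flexibility the abstract describes, combining marginals from different families under one dependence structure, can be sketched with a Gaussian copula: draw correlated normals, push them through the normal CDF to get dependent uniforms, then through any inverse marginal CDFs. The exponential and Gumbel marginals below are illustrative stand-ins for flow and rainfall distributions, not choices from the study.

```python
import math
import numpy as np

def gaussian_copula_sample(n, rho, inv_cdf_x, inv_cdf_y, rng):
    """Draw (X, Y) pairs whose dependence is a Gaussian copula with
    correlation rho, but whose marginals may come from different
    families.  Minimal sketch of the copula construction."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    erf = np.vectorize(math.erf)
    # probability integral transform: Phi(z) is Uniform(0, 1)
    u = 0.5 * (1.0 + erf(z / math.sqrt(2.0)))
    return inv_cdf_x(u[:, 0]), inv_cdf_y(u[:, 1])

# Illustrative marginals (hypothetical parameters):
exp_inv = lambda u: -np.log1p(-u) / 0.5                # Exponential(rate=0.5)
gum_inv = lambda u: 10.0 - 2.0 * np.log(-np.log(u))    # Gumbel(loc=10, scale=2)
```

    The dependence (rho) and the marginal families are specified independently, which is exactly the limitation of conventional multivariate distributions that the copula method removes.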

    Matching marginal moments and lag autocorrelations with MAPs

    This paper presents a procedure that constructs a Markovian Arrival Process (MAP) based on the mean, the squared coefficient of variation, and the lag-1 autocorrelation of the inter-arrival times. The method always provides a valid MAP without posing any restrictions on the three input parameters. Besides matching these three parameters, it is also possible to match the third moment of the inter-arrival times and the decay of the autocorrelation function, provided they fall within the given (very wide) bounds.
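    The three matched quantities can be computed for any candidate MAP (D0, D1) with standard matrix-analytic formulas, which is useful for verifying a construction. A sketch of that check (not the paper's fitting procedure itself):

```python
import numpy as np

def map_interarrival_stats(d0, d1):
    """Mean, squared coefficient of variation, and lag-1 autocorrelation
    of the inter-arrival times of a MAP (D0, D1), via standard
    matrix-analytic formulas."""
    d0 = np.asarray(d0, float)
    d1 = np.asarray(d1, float)
    inv = np.linalg.inv(-d0)
    p = inv @ d1                          # embedded chain at arrival epochs
    n = p.shape[0]
    # stationary vector of P: solve pi (P - I) = 0 with sum(pi) = 1
    a = np.vstack([(p - np.eye(n)).T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi = np.linalg.lstsq(a, b, rcond=None)[0]
    one = np.ones(n)
    m1 = pi @ inv @ one                   # E[X]
    m2 = 2.0 * pi @ inv @ inv @ one       # E[X^2]
    var = m2 - m1**2
    joint = pi @ inv @ p @ inv @ one      # E[X_k X_{k+1}]
    return m1, var / m1**2, (joint - m1**2) / var
```

    As a sanity check, a Poisson process written as a one-state MAP must give SCV 1 and zero autocorrelation.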

    A quantitative real options method for aviation technology decision-making in the presence of uncertainty

    The development of new technologies for commercial aviation involves significant risk for technologists, as these programs are often driven by fixed assumptions regarding future airline needs while being subject to many uncertainties at the technical and market levels. To prioritize these developments, technologists must assess their economic viability even though standard methods used for capital budgeting are not well suited to handle the overwhelming uncertainty surrounding such developments. This research proposes a framework featuring real options to overcome this challenge. It is motivated by three observations: disregarding the value of managerial flexibility undervalues long-term research and development (R&D) programs; windows of opportunity emerge and disappear, and manufacturers can derive significant value by exploiting their upside potential; and integrating competitive aspects early in the design ensures that development programs are robust with respect to moves by the competition. Real options analyses have been proposed to address some of these points, but adoption has been slow, hindered by constraining frameworks. A panel of academics and practitioners has identified a set of requirements, known as the Georgetown Challenge, that real options analyses must meet to gain more traction amongst practitioners in the industry. In a bid to meet some of these requirements, this research proposes a novel methodology, cross-fertilizing techniques from financial engineering, actuarial science, and statistics, to evaluate and study the timing of technology developments under uncertainty. It aims at substantiating decision making for R&D while having a wider domain of application and an improved ability to handle a complex reality compared to more traditional approaches. The method, named FLexible AViation Investment Analysis (FLAVIA), first uses Monte Carlo techniques to simulate the evolution of the uncertainties driving the value of technology developments.
    A non-parametric Esscher transform is then applied to perform a change of probability measure, expressing these evolutions under the equivalent martingale measure. A bootstrap technique is suggested next to construct new non-weighted evolutions of the technology development value under the new measure. A regression-based technique is finally used to analyze the technology development program and to discover trigger boundaries that help define when the development program should be launched. Verification of the method on several canonical examples indicates good accuracy and competitive execution time. The method is then applied to the analysis of a performance improvement package (PIP) development using the Integrated Cost And Revenue Estimation method (i-CARE) developed as part of this research. The PIP can be retrofitted to currently operating turbofan engines to mitigate the impact of the aging process on their operating costs, and is subject to market uncertainties such as the evolution of jet-fuel prices and the possible taxation of carbon emissions. The profitability of the PIP development is investigated, and the value of managerial flexibility and timing flexibility is highlighted. Thesis (Ph.D.).
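    The change-of-measure step can be illustrated in isolation. A nonparametric Esscher transform reweights observed one-period log-returns x_i with weights proportional to exp(hÂ·x_i), choosing h so that the reweighted expectation of exp(X) equals exp(r); the discounted price is then a martingale under the reweighted (empirical) measure. The sketch below assumes i.i.d. one-period returns and is not the dissertation's full procedure.

```python
import numpy as np

def esscher_weights(log_returns, r, h_lo=-50.0, h_hi=50.0, tol=1e-10):
    """Nonparametric Esscher transform: find h such that the
    exp(h*x)-reweighted empirical expectation of exp(x) equals exp(r).
    The martingale gap is increasing in h (cumulant convexity), so
    bisection suffices once the root is bracketed."""
    x = np.asarray(log_returns, float)

    def gap(h):
        w = np.exp(h * x)
        w /= w.sum()
        return np.log(w @ np.exp(x)) - r

    lo, hi = h_lo, h_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gap(mid) < 0:
            lo = mid
        else:
            hi = mid
    h = 0.5 * (lo + hi)
    w = np.exp(h * x)
    return w / w.sum(), h
```

    The resulting weights can seed a weighted bootstrap that resamples paths under the new measure, which is the role the bootstrap step plays in the method described above.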

    Inference and parameter estimation for diffusion processes

    Diffusion processes provide a natural way of modelling a variety of physical and economic phenomena. It is often the case that one is unable to observe a diffusion process directly and must instead rely on noisy observations that are discretely spaced in time. Given these discrete, noisy observations, one is faced with the task of inferring properties of the underlying diffusion process. For example, one might be interested in inferring the current state of the process given observations up to the present time (this is known as the filtering problem). Alternatively, one might wish to infer parameters governing the time evolution of the diffusion process. In general, one cannot apply Bayes' theorem directly, since the transition density of a general nonlinear diffusion is not computationally tractable. In this thesis, we investigate a novel method of simplifying the problem: the stochastic differential equation that describes the diffusion process is replaced with a simpler ordinary differential equation driven by a random noise that approximates Brownian motion. We show how one can exploit this approximation to improve on standard methods for inferring properties of nonlinear diffusion processes.
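    For intuition, the filtering problem for a discretely and noisily observed diffusion is often attacked with a bootstrap particle filter built on an Euler-Maruyama transition. The sketch below uses an Ornstein-Uhlenbeck process with Gaussian observation noise as a stand-in; it is a standard baseline, not the thesis's random-ODE method.

```python
import numpy as np

def particle_filter_ou(obs, theta, sigma, obs_std, dt, n_particles, rng):
    """Bootstrap particle filter for dX = -theta*X dt + sigma dW observed
    at spacing dt through additive Gaussian noise with std obs_std.
    Euler-Maruyama approximates the (here actually tractable) transition."""
    x = rng.standard_normal(n_particles)          # initial particle cloud
    means = []
    for y in obs:
        # propagate each particle one step with Euler-Maruyama
        x = x - theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_particles)
        # weight by the Gaussian observation likelihood
        w = np.exp(-0.5 * ((y - x) / obs_std) ** 2)
        w /= w.sum()
        means.append(w @ x)                        # filtered mean E[X_t | y_1..t]
        # multinomial resampling keeps the cloud from degenerating
        x = rng.choice(x, size=n_particles, p=w)
    return np.array(means)
```

    Because Euler-Maruyama only approximates the transition density, schemes like this motivate the thesis's question of how best to approximate the underlying SDE in the first place.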
    • 

    corecore