    Random Bit Multilevel Algorithms for Stochastic Differential Equations

    We study the approximation of expectations $\mathbb{E}(f(X))$ for solutions $X$ of SDEs and functionals $f \colon C([0,1],\mathbb{R}^r) \to \mathbb{R}$ by means of restricted Monte Carlo algorithms that may only use random bits instead of random numbers. We consider the worst case setting for functionals $f$ from the Lipschitz class w.r.t. the supremum norm. We construct a random bit multilevel Euler algorithm and establish upper bounds for its error and cost. Furthermore, we derive matching lower bounds, up to a logarithmic factor, that are valid for all random bit Monte Carlo algorithms, and we show that, for the given quadrature problem, random bit Monte Carlo algorithms are at least almost as powerful as general randomized algorithms.
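
    As a rough illustration of the construction (not the paper's own code): the sketch below replaces each standard normal increment by a normalised sum of random bits (Rademacher variables) and assembles the usual multilevel Euler telescoping sum for a geometric Brownian motion with a terminal-value Lipschitz payoff; all names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bit_normal(shape, k=16):
    """Approximate N(0,1) draws from random bits only: a normalised sum of
    k Rademacher variables (i.e. a centred, scaled Binomial(k, 1/2))."""
    bits = rng.integers(0, 2, size=shape + (k,))
    return (2.0 * bits - 1.0).sum(axis=-1) / np.sqrt(k)

def euler_level(level, n_paths, T=1.0, x0=1.0, mu=0.05, sig=0.2,
                f=lambda x: np.maximum(x - 1.0, 0.0)):
    """One term of the multilevel telescoping sum for a geometric Brownian
    motion, with 2**level fine Euler steps and random-bit increments."""
    n_fine = 2 ** level
    h = T / n_fine
    xf = np.full(n_paths, x0)
    if level == 0:
        dW = np.sqrt(h) * bit_normal((n_paths,))
        xf += mu * xf * h + sig * xf * dW
        return f(xf)
    xc = np.full(n_paths, x0)
    for _ in range(n_fine // 2):
        dW1 = np.sqrt(h) * bit_normal((n_paths,))
        dW2 = np.sqrt(h) * bit_normal((n_paths,))
        xf += mu * xf * h + sig * xf * dW1
        xf += mu * xf * h + sig * xf * dW2
        xc += mu * xc * (2 * h) + sig * xc * (dW1 + dW2)  # coarse step reuses the same bits
    return f(xf) - f(xc)

# Telescoping sum over levels: E[f(X_L)] = E[P_0] + sum_l E[P_l - P_{l-1}].
levels, n_samples = 5, 4000
estimate = sum(euler_level(l, n_samples).mean() for l in range(levels + 1))
print(estimate)
```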

    Central limit theorems for multilevel Monte Carlo methods

    In this work, we show that uniform integrability is not a necessary condition for central limit theorems (CLT) to hold for normalized multilevel Monte Carlo (MLMC) estimators, and we provide near-optimal weaker conditions under which the CLT is achieved. In particular, if the variance decay rate dominates the computational cost rate (i.e., $\beta > \gamma$), we prove that the CLT applies to the standard (variance minimizing) MLMC estimator. For other settings where the CLT may not apply to the standard MLMC estimator, we propose an alternative estimator, called the mass-shifted MLMC estimator, to which the CLT always applies. This comes at a small efficiency loss: the computational cost of achieving mean square approximation error $\mathcal{O}(\epsilon^2)$ is at worst a factor $\mathcal{O}(\log(1/\epsilon))$ higher with the mass-shifted estimator than with the standard one.
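
    For context, the standard (variance-minimising) MLMC estimator referred to above chooses per-level sample sizes proportional to sqrt(V_l / C_l). A minimal sketch under assumed geometric rates, with the variance decay rate beta exceeding the cost rate gamma as in the CLT condition quoted above (rates and constants are illustrative, not from the paper):

```python
import numpy as np

def mlmc_allocation(V, C, eps):
    """Classic variance-minimising sample sizes: minimise the total cost sum_l N_l*C_l
    subject to the variance constraint sum_l V_l/N_l <= eps**2 / 2."""
    lam = 2.0 / eps**2 * np.sum(np.sqrt(V * C))
    return np.ceil(lam * np.sqrt(V / C)).astype(int)

beta, gamma = 2.0, 1.0                   # variance decay dominates cost growth (beta > gamma)
L = 6
V = 2.0 ** (-beta * np.arange(L + 1))    # modelled per-level variances V_l ~ 2**(-beta*l)
C = 2.0 ** (gamma * np.arange(L + 1))    # modelled per-level costs     C_l ~ 2**(gamma*l)
print(mlmc_allocation(V, C, eps=1e-2))
```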

    Optimization of mesh hierarchies in Multilevel Monte Carlo samplers

    We perform a general optimization of the parameters in the Multilevel Monte Carlo (MLMC) discretization hierarchy based on uniform discretization methods with general approximation orders and computational costs. We optimize hierarchies with geometric and non-geometric sequences of mesh sizes and show that geometric hierarchies, when optimized, are nearly optimal and have the same asymptotic computational complexity as non-geometric optimal hierarchies. We discuss how enforcing constraints on parameters of MLMC hierarchies affects the optimality of these hierarchies. These constraints include an upper and a lower bound on the mesh size or enforcing that the number of samples and the number of discretization elements are integers. We also discuss the optimal tolerance splitting between the bias and the statistical error contributions and its asymptotic behavior. To provide numerical grounds for our theoretical results, we apply these optimized hierarchies together with the Continuation MLMC Algorithm. The first example considers a three-dimensional elliptic partial differential equation with random inputs. Its space discretization is based on continuous piecewise trilinear finite elements and the corresponding linear system is solved by either a direct or an iterative solver. The second example considers a one-dimensional Itô stochastic differential equation discretized by a Milstein scheme.
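
    As a small illustration of the geometric case (our own placeholder model, not the paper's optimisation): build mesh sizes h_l = h0 * b^{-l} and pick the deepest level so that a modelled weak-error bias c_w * h_L^q fits inside the bias share of the tolerance, with the rest of the mean square error budget reserved for the statistical error.

```python
import numpy as np

def geometric_hierarchy(tol, h0=0.5, b=2.0, q=1.0, c_w=1.0, theta=0.5):
    """Mesh sizes h_l = h0 * b**(-l) with the deepest level L chosen so that the
    modelled bias c_w * h_L**q stays below sqrt(theta)*tol; the remaining
    (1 - theta)*tol**2 of the mean square error budget is left for the variance."""
    bias_budget = np.sqrt(theta) * tol
    L = max(0, int(np.ceil(np.log(c_w * h0**q / bias_budget) / (q * np.log(b)))))
    return h0 * b ** (-np.arange(L + 1))

print(geometric_hierarchy(tol=1e-3))
```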

    Random Bit Quadrature and Approximation of Distributions on Hilbert Spaces

    We study the approximation of expectations $\mathbb{E}(f(X))$ for Gaussian random elements $X$ with values in a separable Hilbert space $H$ and Lipschitz continuous functionals $f \colon H \to \mathbb{R}$. We consider restricted Monte Carlo algorithms, which may only use random bits instead of random numbers. We determine the asymptotics (in some cases sharp up to multiplicative constants, in the other cases sharp up to logarithmic factors) of the corresponding $n$-th minimal error in terms of the decay of the eigenvalues of the covariance operator of $X$. It turns out that, within the margins from above, restricted Monte Carlo algorithms are not inferior to arbitrary Monte Carlo algorithms, and suitable random bit multilevel algorithms are optimal. The analysis of this problem leads to a variant of the quantization problem, namely, the optimal approximation of probability measures on $H$ by uniform distributions supported by a given, finite number of points. We determine the asymptotics (up to multiplicative constants) of the error of the best approximation for the one-dimensional standard normal distribution, for Gaussian measures as above, and for scalar autonomous SDEs.
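
    The quantization variant mentioned at the end has a simple one-dimensional illustration: approximate the standard normal distribution by the uniform distribution on n points. Placing the points at the (2i-1)/(2n) quantiles is our own simple choice (not claimed to be the optimal construction from the paper), and its quality can be probed against individual Lipschitz functionals:

```python
import numpy as np
from scipy.stats import norm

def uniform_quantile_codebook(n):
    """n points whose uniform distribution approximates N(0,1): the
    (2i-1)/(2n) quantiles, i = 1..n (a simple, not necessarily optimal, choice)."""
    return norm.ppf((2 * np.arange(1, n + 1) - 1) / (2 * n))

def expectation_error(f, points, n_mc=100_000, seed=0):
    """Error |E[f(X)] - mean of f over the codebook| for one Lipschitz functional f,
    with E[f(X)] itself estimated by plain Monte Carlo."""
    x = np.random.default_rng(seed).standard_normal(n_mc)
    return abs(f(x).mean() - f(points).mean())

pts = uniform_quantile_codebook(32)
print(expectation_error(np.abs, pts))   # f(x) = |x| is 1-Lipschitz
```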

    Multilevel Monte Carlo simulation for Lévy processes based on the Wiener-Hopf factorisation

    In Kuznetsov et al. (2011) a new Monte Carlo simulation technique, based on the Wiener-Hopf decomposition, was introduced for a large family of Lévy processes. We pursue this idea further by combining their technique with the recently introduced multilevel Monte Carlo methodology. Moreover, we provide here for the first time a theoretical analysis of the new Monte Carlo simulation technique in Kuznetsov et al. (2011) and of its multilevel variant for computing expectations of functions depending on the historical trajectory of a Lévy process. We derive rates of convergence for both methods and show that they are uniform with respect to the "jump activity" (e.g. characterised by the Blumenthal-Getoor index). We also present a modified version of the algorithm in Kuznetsov et al. (2011) which, combined with the multilevel methodology, achieves the optimal rate of convergence for general Lévy processes and Lipschitz functionals. This final result is only a theoretical one at present, since it requires independent sampling from a triple of distributions, which is currently only possible for a limited number of processes.
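
    To make the Wiener-Hopf simulation idea concrete, the sketch below uses the random-walk construction over n exponential time blocks, specialised to Brownian motion with drift, where both Wiener-Hopf factors at an exponential time are explicit exponential distributions. This special case is our choice for a self-contained example; the Lévy processes targeted by the paper require sampling the two factors by other means, and the multilevel layer is omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

def wiener_hopf_position_and_sup(n, t, mu=0.1, sig=0.3):
    """Random-walk construction over n blocks of Exp(n/t) time (their sum is
    Gamma(n, n/t), concentrating at t as n grows): in each block, add an
    independent draw of the supremum factor S and of the infimum factor I.
    For Brownian motion with drift, S ~ Exp(beta_plus) and -I ~ Exp(beta_minus)."""
    q = n / t
    root = np.sqrt(mu**2 + 2.0 * q * sig**2)
    beta_plus = (-mu + root) / sig**2     # rate of the supremum at an Exp(q) time
    beta_minus = (mu + root) / sig**2     # rate of minus the infimum at an Exp(q) time
    S = rng.exponential(1.0 / beta_plus, size=n)
    I = -rng.exponential(1.0 / beta_minus, size=n)
    x, running_sup = 0.0, 0.0
    for s, i in zip(S, I):
        running_sup = max(running_sup, x + s)   # within a block the supremum is x + S
        x += s + i                              # position at the end of the block
    return x, running_sup

samples = np.array([wiener_hopf_position_and_sup(n=64, t=1.0) for _ in range(5000)])
print("E[X_t]   ~", samples[:, 0].mean())       # should be close to mu * t = 0.1
print("E[sup X] ~", samples[:, 1].mean())
```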

    Estimating expected first passage times using multilevel Monte Carlo algorithm

    In this paper we devise a method of numerically estimating the expected first passage times of stochastic processes. We use Monte Carlo path simulations with the Milstein discretisation scheme to approximate the solutions of scalar stochastic differential equations. To further reduce the variance of the estimated expected stopping time and improve computational efficiency, we use the multilevel Monte Carlo algorithm, recently developed by Giles (2008a), and other variance-reduction techniques. Our numerical results show significant improvements over conventional Monte Carlo techniques.
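
    As a point of reference for the quantity being estimated (a single-level sketch only; the multilevel and variance-reduction machinery of the paper is deliberately omitted, and drift, diffusion, barrier and step size are illustrative): simulate paths of a scalar SDE with a Milstein step on a fixed grid and record the first grid time at which an upper barrier is crossed, with paths that never cross keeping the horizon value.

```python
import numpy as np

def milstein_first_passage(n_paths=10_000, h=1e-2, t_max=10.0, x0=1.0, barrier=2.0,
                           mu=lambda x: 0.5 * x,
                           sigma=lambda x: 0.3 * x,
                           dsigma=lambda x: 0.3):   # sigma'(x), needed by the Milstein term
    """Crude estimate of E[tau], tau = first time X crosses `barrier`, for
    dX = mu(X) dt + sigma(X) dW, using a Milstein scheme on a fixed grid."""
    rng = np.random.default_rng(0)
    x = np.full(n_paths, x0)
    tau = np.full(n_paths, t_max)          # paths that never cross keep t_max (censoring)
    alive = np.ones(n_paths, dtype=bool)
    t = 0.0
    while t < t_max and alive.any():
        xa = x[alive]
        dW = np.sqrt(h) * rng.standard_normal(xa.size)
        x[alive] = (xa + mu(xa) * h + sigma(xa) * dW
                    + 0.5 * sigma(xa) * dsigma(xa) * (dW**2 - h))
        t += h
        crossed = alive & (x >= barrier)   # paths that crossed on this step
        tau[crossed] = t
        alive &= ~crossed
    return tau.mean()

print(milstein_first_passage())
```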