
    High Performance Financial Simulation Using Randomized Quasi-Monte Carlo Methods

    GPU computing has become popular in computational finance, and many financial institutions are moving their CPU-based applications to the GPU platform. Since most Monte Carlo algorithms are embarrassingly parallel, they benefit greatly from parallel implementations, and consequently Monte Carlo has become a focal point in GPU computing. GPU speed-up examples reported in the literature often involve Monte Carlo algorithms, and commercially available software tools help migrate Monte Carlo financial pricing models to the GPU. We present a survey of Monte Carlo and randomized quasi-Monte Carlo methods, and discuss the (quasi-)Monte Carlo sequences available in existing GPU libraries. We discuss specific features of GPU architecture relevant for developing efficient (quasi-)Monte Carlo methods. We introduce a recent randomized quasi-Monte Carlo method and compare it with some of the existing GPU implementations when pricing caplets in the LIBOR market model and mortgage-backed securities.
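
    Not the paper's GPU caplet/LIBOR implementation: the following is a minimal CPU sketch of randomized quasi-Monte Carlo, using a scrambled Sobol sequence from scipy.stats.qmc to price a European call under Black-Scholes; the option parameters and the number of scrambled replicates are illustrative assumptions.

        import numpy as np
        from scipy.stats import norm, qmc

        # Illustrative Black-Scholes parameters (assumed, not from the paper)
        s0, strike, rate, sigma, maturity = 100.0, 100.0, 0.05, 0.2, 1.0

        def call_price_from_uniforms(u):
            # Map uniforms to standard normals, then to terminal asset prices under GBM
            z = norm.ppf(u[:, 0])
            st = s0 * np.exp((rate - 0.5 * sigma**2) * maturity + sigma * np.sqrt(maturity) * z)
            return np.exp(-rate * maturity) * np.maximum(st - strike, 0.0)

        # Randomized QMC: average over independently scrambled Sobol replicates,
        # which also provides an error estimate that plain QMC lacks.
        estimates = []
        for seed in range(16):
            points = qmc.Sobol(d=1, scramble=True, seed=seed).random(2**12)
            estimates.append(call_price_from_uniforms(points).mean())
        print("RQMC price:", np.mean(estimates), "+/-", np.std(estimates) / np.sqrt(len(estimates)))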

    Metropolis Methods for Quantum Monte Carlo Simulations

    Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of ways for the simulation of continuum quantum many-body systems. This paper considers some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e., diffusion Monte Carlo with rejection), multilevel sampling in path-integral Monte Carlo, the sampling of permutations, cluster methods for lattice models, the penalty method for coupled electron-ionic systems, and the Bayesian analysis of imaginary-time correlation functions. Comment: Proceedings of "Monte Carlo Methods in the Physical Sciences", Celebrating the 50th Anniversary of the Metropolis Algorithm
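
    As a concrete illustration of the variational Monte Carlo flavour listed above (not code from the paper), here is a minimal sketch that uses Metropolis sampling of |psi|^2 for a 1D harmonic oscillator with a Gaussian trial wavefunction psi(x) = exp(-a x^2); the variational parameter a, step size, and chain length are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        a, step, n_steps = 0.4, 1.0, 50_000   # illustrative variational parameter and tuning

        def log_psi2(x):
            # log |psi(x)|^2 for the trial wavefunction psi = exp(-a x^2)
            return -2.0 * a * x * x

        def local_energy(x):
            # E_L = (H psi)/psi for H = -1/2 d^2/dx^2 + 1/2 x^2
            return a + x * x * (0.5 - 2.0 * a * a)

        x, energies, accepted = 0.0, [], 0
        for _ in range(n_steps):
            x_new = x + rng.uniform(-step, step)
            # Metropolis acceptance for sampling |psi|^2
            if np.log(rng.uniform()) < log_psi2(x_new) - log_psi2(x):
                x, accepted = x_new, accepted + 1
            energies.append(local_energy(x))

        print("variational energy:", np.mean(energies), "acceptance rate:", accepted / n_steps)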

    Comparative Monte Carlo Efficiency by Monte Carlo Analysis

    We propose a modified power method for computing the subdominant eigenvalue λ_2 of a matrix or continuous operator. Here we focus on defining simple Monte Carlo methods for its application. The methods presented use random walkers of mixed signs to represent the subdominant eigenfunction. Accordingly, the methods must cancel these signs properly in order to sample this eigenfunction faithfully. We present a simple procedure to solve this sign problem and then test our Monte Carlo methods by computing λ_2 for various Markov chain transition matrices. We first computed λ_2 for several one- and two-dimensional Ising models, which have a discrete phase space, and compared the relative efficiencies of the Metropolis and heat-bath algorithms as a function of temperature and applied magnetic field. Next, we computed λ_2 for a model of an interacting gas trapped by a harmonic potential, which has a multidimensional continuous phase space, and studied the efficiency of the Metropolis algorithm as a function of temperature and the maximum allowable step size Δ. Based on the λ_2 criterion, we found for the Ising models that small lattices appear to give an adequate picture of comparative efficiency and that the heat-bath algorithm is more efficient than the Metropolis algorithm only at low temperatures, where both algorithms are inefficient. For the harmonic trap problem, we found that the traditional rule of thumb of adjusting Δ so that the Metropolis acceptance rate is around 50% is often sub-optimal. In general, as a function of temperature or Δ, λ_2 for this model displayed trends defining optimal efficiency that the acceptance ratio does not. The cases studied also suggested that Monte Carlo simulations for a continuum model are likely more efficient than those for a discretized version of the model. Comment: 23 pages, 8 figures
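
    A deterministic toy version of the λ_2 criterion (not the paper's Monte Carlo power method): build the single-spin-flip Metropolis transition matrix for a 3-spin periodic Ising chain explicitly and read off the subdominant eigenvalue with numpy; the chain length, temperature, and field are illustrative assumptions.

        import numpy as np
        from itertools import product

        J, h, T, n = 1.0, 0.0, 2.0, 3          # illustrative coupling, field, temperature, chain length
        states = list(product([-1, 1], repeat=n))

        def energy(s):
            # Periodic 1D Ising chain: E = -J * sum_i s_i s_{i+1} - h * sum_i s_i
            return -J * sum(s[i] * s[(i + 1) % n] for i in range(n)) - h * sum(s)

        # Single-spin-flip Metropolis transition matrix P[i, j] = Prob(state_i -> state_j)
        m = len(states)
        P = np.zeros((m, m))
        for i, s in enumerate(states):
            for k in range(n):
                flipped = list(s)
                flipped[k] = -flipped[k]
                j = states.index(tuple(flipped))
                dE = energy(tuple(flipped)) - energy(s)
                P[i, j] += min(1.0, np.exp(-dE / T)) / n   # propose spin k with prob 1/n
            P[i, i] = 1.0 - P[i].sum()                     # rejected proposals stay put

        # The largest eigenvalue is 1; the subdominant one governs the mixing time
        eigvals = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
        print("lambda_2 =", eigvals[1])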

    Hybrid Monte Carlo-Methods in Credit Risk Management

    In this paper we analyze and compare the use of Monte Carlo, quasi-Monte Carlo and hybrid Monte Carlo methods in the credit risk management system CreditMetrics by J.P. Morgan. We show that hybrid sequences, when used for the simulations in a suitable way, perform better than pure Monte Carlo and pure quasi-Monte Carlo methods in many relevant situations, and essentially never perform worse than these methods. Comment: 18 pages, 18 figures
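
    A rough sketch of the hybrid-sequence idea (not the CreditMetrics experiments themselves): the leading, most important coordinates of each point come from a scrambled Sobol sequence and the remaining coordinates are pseudorandom, and the points are then used in an ordinary sample-mean estimator; the dimension split and the toy integrand are assumptions made for illustration.

        import numpy as np
        from scipy.stats import qmc

        dim, qmc_dim, n = 16, 4, 2**12        # total dimension and assumed QMC/pseudorandom split
        rng = np.random.default_rng(1)

        # Hybrid points: leading coordinates from scrambled Sobol, trailing ones pseudorandom
        sobol_part = qmc.Sobol(d=qmc_dim, scramble=True, seed=1).random(n)
        random_part = rng.random((n, dim - qmc_dim))
        points = np.hstack([sobol_part, random_part])

        def toy_integrand(u):
            # Stand-in for, e.g., a portfolio loss functional; its exact integral over [0,1]^dim is 1
            return np.prod(1.0 + 0.5 * (u - 0.5), axis=1)

        print("hybrid estimate (exact value 1):", toy_integrand(points).mean())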

    Population Monte Carlo algorithms

    We give a cross-disciplinary survey of "population" Monte Carlo algorithms. In these algorithms, a set of "walkers" or "particles" is used as a representation of a high-dimensional vector. The computation is carried out by a random walk and the split/deletion of these objects. The algorithms have been developed in various fields of physics and the statistical sciences and are known by many different names, including "quantum Monte Carlo", "transfer-matrix Monte Carlo", "Monte Carlo filter (particle filter)", "sequential Monte Carlo" and "PERM". Here we discuss them in a coherent framework. We also touch on related algorithms, such as genetic algorithms and annealed importance sampling. Comment: Title is changed (Population-based Monte Carlo -> Population Monte Carlo). A number of small but important corrections and additions. References are also added. The original version was read at the 2000 Workshop on Information-Based Induction Sciences (July 17-18, 2000, Syuzenji, Shizuoka, Japan). No figures
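
    One member of this family in miniature: a bootstrap particle filter (sequential Monte Carlo) for a linear Gaussian state-space model, in which resampling plays the role of the split/deletion of walkers described above; the model, noise levels, and particle count are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        n_steps, n_particles = 50, 1000
        phi, sigma_x, sigma_y = 0.9, 1.0, 0.5   # illustrative AR(1) state and observation noise

        # Simulate synthetic data from x_t = phi * x_{t-1} + noise, y_t = x_t + noise
        x_true, ys = 0.0, []
        for _ in range(n_steps):
            x_true = phi * x_true + sigma_x * rng.normal()
            ys.append(x_true + sigma_y * rng.normal())

        # Bootstrap particle filter: propagate, weight by the likelihood, resample (split/delete walkers)
        particles = rng.normal(size=n_particles)
        filtered_means = []
        for y in ys:
            particles = phi * particles + sigma_x * rng.normal(size=n_particles)
            log_w = -0.5 * ((y - particles) / sigma_y) ** 2
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            idx = rng.choice(n_particles, size=n_particles, p=w)   # multinomial resampling
            particles = particles[idx]
            filtered_means.append(particles.mean())

        print("last filtered mean:", filtered_means[-1], "last observation:", ys[-1])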

    Off-diagonal Wave Function Monte Carlo Studies of Hubbard Model I

    We propose a Monte Carlo method, a hybrid of the quantum Monte Carlo method and variational Monte Carlo theory, to study the Hubbard model. The theory is based on off-diagonal and Gutzwiller-type correlation factors, which are taken into account by a Monte Carlo algorithm. For the 4x4 system our method reproduces the exact results obtained by diagonalization. An application is given to the half-filled band case of the two-dimensional square lattice. The energy compares favorably with quantum Monte Carlo data. Comment: 9 pages, 11 figures
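
    A minimal variational Monte Carlo sketch in the same spirit (only a Gutzwiller factor on a free-fermion determinant, without the off-diagonal factors of the paper): Metropolis sampling of |psi|^2 for a tiny open Hubbard chain at half filling, measuring the average double occupancy; the chain length, Gutzwiller parameter, and number of Monte Carlo steps are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n_sites, n_up, n_dn = 4, 2, 2        # tiny open Hubbard chain at half filling (illustrative)
        g = 0.7                              # assumed Gutzwiller factor suppressing double occupancy

        # Lowest single-particle orbitals of the open tight-binding chain (hopping t = 1)
        hop = np.zeros((n_sites, n_sites))
        for i in range(n_sites - 1):
            hop[i, i + 1] = hop[i + 1, i] = -1.0
        orbitals = np.linalg.eigh(hop)[1]
        phi_up, phi_dn = orbitals[:, :n_up], orbitals[:, :n_dn]

        def weight(up, dn):
            # |psi|^2 for the Gutzwiller wavefunction: g^(2D) * det(up)^2 * det(dn)^2
            d_occ = len(set(up) & set(dn))
            det_u = np.linalg.det(phi_up[list(up), :])
            det_d = np.linalg.det(phi_dn[list(dn), :])
            return g ** (2 * d_occ) * det_u**2 * det_d**2

        up, dn = [0, 1], [2, 3]              # starting occupied sites for up and down spins
        w, d_samples = weight(up, dn), []
        for _ in range(20_000):
            occ = up if rng.random() < 0.5 else dn
            trial = occ.copy()
            empty = [s for s in range(n_sites) if s not in occ]
            trial[rng.integers(len(occ))] = empty[rng.integers(len(empty))]
            new_up, new_dn = (trial, dn) if occ is up else (up, trial)
            w_new = weight(new_up, new_dn)
            if rng.random() < w_new / w:     # Metropolis acceptance on |psi|^2
                up, dn, w = new_up, new_dn, w_new
            d_samples.append(len(set(up) & set(dn)))

        print("average double occupancy:", np.mean(d_samples))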

    General Construction of Irreversible Kernel in Markov Chain Monte Carlo

    The Markov chain Monte Carlo update method for constructing an irreversible kernel is reviewed and extended to general state spaces. Several convergence conditions of the Markov chain are discussed. Alternatives to the Gibbs sampler and the Metropolis-Hastings algorithm are proposed and assessed on several models. Convergence to the target distribution and sampling efficiency are significantly improved for the Potts model, the bivariate Gaussian model, and others. This approach, based on an irreversible kernel, can be applied to any Markov chain Monte Carlo sampling and is expected to improve efficiency in general. Comment: 16 pages, 8 figures; submitted to the proceedings of The Tenth International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing (MCQMC 2012), which will be published by Springer-Verlag in a book entitled Monte Carlo and Quasi-Monte Carlo Methods 2012
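
    A minimal sketch of one standard irreversible construction, lifting a one-dimensional Metropolis walk with a persistent direction variable; this is not necessarily the specific kernel proposed in the paper, and the discrete target distribution below is an illustrative assumption.

        import numpy as np

        rng = np.random.default_rng(4)
        n_states = 20
        target = np.exp(-0.5 * ((np.arange(n_states) - 10) / 3.0) ** 2)   # illustrative discrete target
        target /= target.sum()

        def prob(x):
            # Target probability, zero outside the state space
            return target[x] if 0 <= x < n_states else 0.0

        # Lifted chain on (x, direction): keep moving in one direction and
        # flip the direction only when a forward move is rejected.
        x, direction = 10, 1
        counts = np.zeros(n_states)
        for _ in range(200_000):
            x_new = x + direction
            if rng.random() < min(1.0, prob(x_new) / prob(x)):
                x = x_new                  # accepted: persistent, irreversible flow
            else:
                direction = -direction     # rejected: reverse the lifting variable
            counts[x] += 1

        print("max abs error vs target:", np.abs(counts / counts.sum() - target).max())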

    Introduction to Monte Carlo Methods

    Monte Carlo methods play an important role in scientific computation, especially when problems have a vast phase space. This lecture gives an introduction to the Monte Carlo method. Concepts such as Markov chains, detailed balance, critical slowing down, and ergodicity, as well as the Metropolis algorithm, are explained. The Monte Carlo method is illustrated by numerically studying the critical behavior of the two-dimensional Ising ferromagnet using finite-size scaling methods. In addition, advanced Monte Carlo methods are described (e.g., the Wolff cluster algorithm and parallel tempering Monte Carlo) and illustrated with nontrivial models from the physics of glassy systems. Finally, we outline an approach to studying rare events using Monte Carlo sampling with a guiding function. Comment: lecture at the third international summer school "Modern Computation Science", 15-26 August 2011, Oldenburg (Germany), see http://www.mcs.uni-oldenburg.d
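
    A minimal sketch of the two-dimensional Ising Metropolis simulation that such a lecture builds on (lattice size, temperature, and number of sweeps are illustrative; the finite-size scaling analysis itself is not reproduced here).

        import numpy as np

        rng = np.random.default_rng(5)
        L, T, n_sweeps = 16, 2.27, 2000       # illustrative lattice size, temperature near T_c, sweeps
        spins = rng.choice([-1, 1], size=(L, L))

        def sweep(spins):
            # One Metropolis sweep: attempt L*L single-spin flips with periodic boundaries
            for _ in range(L * L):
                i, j = rng.integers(L), rng.integers(L)
                nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                dE = 2.0 * spins[i, j] * nn
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i, j] *= -1

        magnetizations = []
        for s in range(n_sweeps):
            sweep(spins)
            if s > n_sweeps // 2:             # discard the first half as equilibration
                magnetizations.append(abs(spins.mean()))

        print("mean |magnetization| per spin:", np.mean(magnetizations))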

    Fast orthogonal transforms for multi-level quasi-Monte Carlo integration

    We combine a generic method for finding fast orthogonal transforms for a given quasi-Monte Carlo integration problem with the multilevel Monte Carlo method. It is shown by example that this combined method can vastly improve the efficiency of quasi-Monte Carlo.
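
    A sketch of the plain multilevel Monte Carlo estimator that the paper combines with fast orthogonal transforms (the transform/quasi-Monte Carlo part is omitted here): coupled fine and coarse Euler paths of a geometric Brownian motion estimate each level correction in the telescoping sum; the SDE parameters, number of levels, and samples per level are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(6)
        s0, strike, rate, sigma, maturity = 100.0, 100.0, 0.05, 0.2, 1.0   # illustrative parameters

        def level_estimate(level, n_paths):
            # Estimate E[P_fine - P_coarse] on this level with coupled Euler paths of a GBM
            n_fine = 2 ** level
            dt = maturity / n_fine
            dw = rng.normal(scale=np.sqrt(dt), size=(n_paths, n_fine))
            s_fine = np.full(n_paths, s0)
            for k in range(n_fine):
                s_fine = s_fine * (1 + rate * dt + sigma * dw[:, k])
            payoff_fine = np.exp(-rate * maturity) * np.maximum(s_fine - strike, 0.0)
            if level == 0:
                return payoff_fine.mean()
            dw_coarse = dw[:, 0::2] + dw[:, 1::2]        # same Brownian increments, coarser grid
            s_coarse = np.full(n_paths, s0)
            for k in range(n_fine // 2):
                s_coarse = s_coarse * (1 + rate * 2 * dt + sigma * dw_coarse[:, k])
            payoff_coarse = np.exp(-rate * maturity) * np.maximum(s_coarse - strike, 0.0)
            return (payoff_fine - payoff_coarse).mean()

        # Telescoping sum over levels; per-level sample sizes are fixed here for simplicity
        samples_per_level = [40_000, 20_000, 10_000, 5_000]
        price = sum(level_estimate(l, n) for l, n in enumerate(samples_per_level))
        print("MLMC price estimate:", price)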

    Adaptive Tuning Of Hamiltonian Monte Carlo Within Sequential Monte Carlo

    Sequential Monte Carlo (SMC) samplers form an attractive alternative to MCMC for Bayesian computation. However, their performance depends strongly on the Markov kernels used to rejuvenate particles. We discuss how to automatically calibrate Hamiltonian Monte Carlo kernels within SMC, using the current particles. To do so, we build upon the adaptive SMC approach of Fearnhead and Taylor (2013), and we also suggest alternative methods. We illustrate the advantages of using HMC kernels within an SMC sampler via an extensive numerical study.
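
    A minimal sketch of a single Hamiltonian Monte Carlo rejuvenation move of the kind tuned inside an SMC sampler; the Gaussian target, step size, and number of leapfrog steps are illustrative assumptions (in the adaptive setting described above they would be calibrated from the current particle population rather than fixed by hand).

        import numpy as np

        rng = np.random.default_rng(7)
        dim, step_size, n_leapfrog = 2, 0.2, 10     # illustrative tuning parameters

        def neg_log_target(x):
            # Standard Gaussian target; in an SMC sampler this would be the current tempered posterior
            return 0.5 * np.dot(x, x)

        def grad_neg_log_target(x):
            return x

        def hmc_step(x):
            # One HMC transition: sample momentum, run the leapfrog integrator, Metropolis accept/reject
            p = rng.normal(size=dim)
            x_new, p_new = x.copy(), p.copy()
            p_new -= 0.5 * step_size * grad_neg_log_target(x_new)
            for _ in range(n_leapfrog - 1):
                x_new += step_size * p_new
                p_new -= step_size * grad_neg_log_target(x_new)
            x_new += step_size * p_new
            p_new -= 0.5 * step_size * grad_neg_log_target(x_new)
            h_old = neg_log_target(x) + 0.5 * np.dot(p, p)
            h_new = neg_log_target(x_new) + 0.5 * np.dot(p_new, p_new)
            return x_new if np.log(rng.uniform()) < h_old - h_new else x

        # Rejuvenate a small cloud of particles with one HMC move each
        particles = rng.normal(size=(500, dim)) * 3.0
        particles = np.array([hmc_step(x) for x in particles])
        print("particle mean after one HMC move:", particles.mean(axis=0))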