High Performance Financial Simulation Using Randomized Quasi-Monte Carlo Methods
GPU computing has become popular in computational finance, and many financial
institutions are moving their CPU-based applications to the GPU platform. Since
most Monte Carlo algorithms are embarrassingly parallel, they benefit greatly
from parallel implementations, and consequently Monte Carlo has become a focal
point in GPU computing. GPU speed-up examples reported in the literature often
involve Monte Carlo algorithms, and commercial software tools are available
that help migrate Monte Carlo financial pricing models to the GPU.
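The "embarrassingly parallel" property mentioned above can be seen in a minimal sketch: each simulated path needs only its own random draw, so paths can be computed with no communication between them. The example below prices a European call under Black-Scholes dynamics, which is a simpler setting than the LIBOR market model used in the paper; the function name and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=0):
    """Plain Monte Carlo price of a European call under Black-Scholes
    dynamics (an illustrative stand-in for the paper's models).

    Every path is independent: on a GPU, each thread could draw its own
    normal variate and evaluate the payoff with no synchronization,
    which is why Monte Carlo is called embarrassingly parallel.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)                      # one draw per path
    # terminal stock price under geometric Brownian motion
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)                      # call payoff per path
    return np.exp(-r * t) * payoff.mean()                 # discounted average

price = mc_european_call(s0=100.0, k=100.0, r=0.05, sigma=0.2,
                         t=1.0, n_paths=200_000)
```

With 200,000 paths the estimate lands within a few cents of the closed-form Black-Scholes value (about 10.45 for these parameters), and the error shrinks at the usual O(n^{-1/2}) Monte Carlo rate.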
We present a survey of Monte Carlo and randomized quasi-Monte Carlo methods,
and discuss the existing (quasi) Monte Carlo sequences in GPU libraries. We
discuss specific features of the GPU architecture relevant to developing
efficient (quasi) Monte Carlo methods. We introduce a recent randomized
quasi-Monte Carlo method, and compare it with some of the existing GPU
implementations when they are used to price caplets in the LIBOR market model
and mortgage-backed securities.
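To make the idea of randomized quasi-Monte Carlo concrete, the sketch below prices the same illustrative Black-Scholes call using a scrambled Sobol' sequence via SciPy's `scipy.stats.qmc` module. This is a generic RQMC example, not the specific randomized method introduced in the paper; the function name and parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm, qmc

def rqmc_european_call(s0, k, r, sigma, t, m, seed=0):
    """Randomized quasi-Monte Carlo price of a European call using a
    scrambled Sobol' sequence (a generic RQMC scheme, not the specific
    method of the paper).

    Scrambling randomizes the low-discrepancy point set, so repeated
    independent runs yield an unbiased estimator and an empirical error
    estimate, while retaining the faster QMC convergence.
    """
    sob = qmc.Sobol(d=1, scramble=True, seed=seed)
    u = sob.random(2**m).ravel()               # 2**m scrambled Sobol' points in (0, 1)
    z = norm.ppf(u)                            # map uniforms to standard normals
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    return np.exp(-r * t) * np.maximum(st - k, 0.0).mean()

price = rqmc_european_call(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0, m=14)
```

With only 2^14 = 16,384 points, the scrambled-Sobol' estimate is typically much closer to the true value than a plain Monte Carlo estimate of the same size, which is the accuracy advantage that motivates moving (quasi) Monte Carlo sequences onto the GPU.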