
    Fast Automatic Bayesian Cubature Using Matching Kernels and Designs

    Automatic cubatures approximate integrals to user-specified error tolerances. For high-dimensional problems, it is difficult to adaptively change the sampling pattern to focus on peaks because peaks can hide more easily in high-dimensional space. One can, however, automatically determine the sample size, n, given a reasonable, fixed sampling pattern. This approach is pursued in Jagadeeswaran and Hickernell, Stat. Comput., 29:1214-1229, 2019, where a Bayesian perspective is used to construct a credible interval for the integral, and the computation is terminated when the half-width of the interval is no greater than the required error tolerance. Our earlier work employs integration lattice sampling, and the computations are expedited by the fast Fourier transform because the covariance kernels for the Gaussian process prior on the integrand are chosen to be shift-invariant. In this chapter, we extend our fast automatic Bayesian cubature to digital net sampling via digitally shift-invariant covariance kernels and fast Walsh transforms. Our algorithm is implemented in the MATLAB Guaranteed Automatic Integration Library (GAIL) and the QMCPy Python library. (Comment: PhD thesis)
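    The stopping rule described above can be conveyed with a minimal sketch. This is not the paper's Bayesian cubature (which uses lattice or digital net sampling, matched covariance kernels, and fast transforms); plain IID Monte Carlo with a CLT-based interval stands in, purely to illustrate "double the sample size until the interval half-width meets the tolerance". The function name and parameters are illustrative.

```python
import numpy as np

def auto_mc_integrate(f, d, abs_tol=1e-3, n_init=2**8, n_max=2**20, seed=7):
    """Simplified automatic stopping: grow the sample size n until the
    half-width of a ~99% confidence interval for the integral of f over
    [0,1]^d is no greater than abs_tol (or n_max is reached)."""
    rng = np.random.default_rng(seed)
    n = n_init
    while True:
        x = rng.random((n, d))                          # IID uniform points
        y = f(x)
        est = y.mean()
        half_width = 2.58 * y.std(ddof=1) / np.sqrt(n)  # CLT interval
        if half_width <= abs_tol or n >= n_max:
            return est, half_width
        n *= 2                                          # tolerance not met

# Example: integrate the sum of coordinates over [0,1]^3 (true value 1.5).
est, hw = auto_mc_integrate(lambda x: x.sum(axis=1), d=3, abs_tol=1e-2)
```

    The Bayesian algorithms replace the CLT interval with a credible interval derived from a Gaussian process prior, which is what makes the guarantee rigorous under the stated assumptions on the integrand.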

    On Bounding and Approximating Functions of Multiple Expectations using Quasi-Monte Carlo

    Monte Carlo and Quasi-Monte Carlo methods present a convenient approach for approximating the expected value of a random variable. Algorithms exist to adaptively sample the random variable until a user-defined absolute error tolerance is satisfied with high probability. This work describes an extension of such methods that supports adaptive sampling to satisfy general error criteria for functions of a common array of expectations. Although several functions involving multiple expectations are being evaluated, only one random sequence is required, albeit sometimes of larger dimension than the underlying randomness. These enhanced Monte Carlo and Quasi-Monte Carlo algorithms are implemented in the QMCPy Python package with support for economic and parallel function evaluation. We exemplify these capabilities on problems from machine learning and global sensitivity analysis.
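    The key idea, several expectations estimated from one shared random sequence with sampling continued until an error criterion on the combined quantity is met, can be sketched as follows. The corner-based error bound and the function names are illustrative simplifications (valid when the combining function is monotone in each argument), not the adaptive algorithms of the paper.

```python
from itertools import product
import numpy as np

def auto_function_of_means(f_list, combine, d, abs_tol=1e-3,
                           n_init=2**8, n_max=2**20, seed=11):
    """Estimate combine(E[f_1], ..., E[f_k]) where all f_i are evaluated
    on ONE shared point set; double n until a crude error bound, obtained
    by evaluating combine at the corners of the per-mean confidence
    intervals, is no greater than abs_tol."""
    rng = np.random.default_rng(seed)
    n = n_init
    while True:
        x = rng.random((n, d))            # one sequence serves all integrands
        ys = [f(x) for f in f_list]
        means = np.array([y.mean() for y in ys])
        hws = np.array([2.58 * y.std(ddof=1) / np.sqrt(n) for y in ys])
        center = combine(means)
        corners = [combine(means + np.array(s) * hws)
                   for s in product([-1.0, 1.0], repeat=len(means))]
        err = max(abs(c - center) for c in corners)
        if err <= abs_tol or n >= n_max:
            return center, err
        n *= 2

# Example: the product of two expectations, E[U1] * E[U2] = 0.25,
# estimated from a single shared sequence of 2-dimensional points.
est, err = auto_function_of_means(
    [lambda x: x[:, 0], lambda x: x[:, 1]],
    combine=lambda m: m[0] * m[1], d=2, abs_tol=1e-2)
```

    Sharing one sequence is also what makes economic function evaluation possible: samples drawn for one expectation are reused for all the others.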

    Challenges in Developing Great Quasi-Monte Carlo Software

    Quasi-Monte Carlo (QMC) methods have developed over several decades. With the explosion in computational science, there is a need for great software that implements QMC algorithms. We summarize the QMC software that has been developed to date, propose some criteria for developing great QMC software, and suggest some steps toward achieving great software. We illustrate these criteria and steps with the Quasi-Monte Carlo Python library (QMCPy), an open-source community software framework, extensible by design with common programming interfaces to an increasing number of existing or emerging QMC libraries developed by the greater community of QMC researchers.
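    A common programming interface of the kind described above can be sketched in a few lines. The class and method names here are hypothetical, not QMCPy's actual API: the point is only that once every generator exposes the same gen_samples(n) method, client code can swap a low-discrepancy sequence (a pure-Python Halton generator below) for IID sampling, or for a backend from another library, without changing.

```python
from abc import ABC, abstractmethod
import numpy as np

class LDSequence(ABC):
    """Hypothetical minimal common interface for point generators."""
    @abstractmethod
    def gen_samples(self, n: int) -> np.ndarray: ...

class HaltonSeq(LDSequence):
    """Halton sequence: coordinate j is the radical inverse in the j-th
    prime base (a simple, self-contained low-discrepancy generator)."""
    PRIMES = (2, 3, 5, 7, 11, 13)
    def __init__(self, d: int):
        assert d <= len(self.PRIMES)
        self._bases = self.PRIMES[:d]
    def gen_samples(self, n: int) -> np.ndarray:
        return np.column_stack(
            [self._radical_inverse(n, b) for b in self._bases])
    @staticmethod
    def _radical_inverse(n: int, base: int) -> np.ndarray:
        out = np.empty(n)
        for i in range(n):
            k, f, x = i + 1, 1.0, 0.0
            while k > 0:                    # reflect base-`base` digits
                f /= base                   # of k about the radix point
                x += f * (k % base)
                k //= base
            out[i] = x
        return out

class IIDSeq(LDSequence):
    """Plain IID uniform points behind the same interface."""
    def __init__(self, d: int, seed: int = 7):
        self._d, self._rng = d, np.random.default_rng(seed)
    def gen_samples(self, n: int) -> np.ndarray:
        return self._rng.random((n, self._d))

# Client code depends only on the interface, not on the backing generator.
for gen in (HaltonSeq(d=2), IIDSeq(d=2)):
    pts = gen.gen_samples(8)
    assert pts.shape == (8, 2)
```

    An adapter pattern like this is what lets a framework remain extensible: wrapping an external QMC library means implementing one small class against the shared interface.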