
    Implicit sampling for path integral control, Monte Carlo localization, and SLAM

    We explore the applicability and usefulness of implicit sampling, a recently developed variationally enhanced sampling method, in stochastic optimal control, stochastic localization, and simultaneous localization and mapping (SLAM). The theory is illustrated with examples, and implicit sampling is found to be significantly more efficient than current Monte Carlo methods in test problems for all three applications.
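    The abstract does not restate how implicit sampling works, so the sketch below records the basic construction on a toy one-dimensional target: minimize F = -log p, map a Gaussian reference draw xi to a sample by solving F(x) - min F = xi^2/2 on the side of the minimizer indicated by sign(xi), and weight by the Jacobian of that map. The convex potential F used here is an arbitrary example, not one of the control, localization, or SLAM problems from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar, brentq

rng = np.random.default_rng(0)

# Hypothetical 1D target p(x) ∝ exp(-F(x)); F is convex with a unique minimum.
def F(x):
    return np.cosh(x) + 0.5 * x

def dF(x):
    return np.sinh(x) + 0.5

# Step 1 of implicit sampling: minimize F (here a simple 1D optimization).
res = minimize_scalar(F)
x_star, phi = res.x, res.fun

def sample_one():
    """Map one standard normal draw xi to a weighted sample of p."""
    xi = rng.standard_normal()
    level = phi + 0.5 * xi**2            # value of F the sample must attain
    step = 1.0
    if xi >= 0:                          # solve F(x) = level right of the minimizer
        hi = x_star + step
        while F(hi) < level:
            hi += step
        x = brentq(lambda z: F(z) - level, x_star, hi)
    else:                                # ... or left of it
        lo = x_star - step
        while F(lo) < level:
            lo -= step
        x = brentq(lambda z: F(z) - level, lo, x_star)
    # Importance weight ∝ |dx/dxi| = |xi / F'(x)|; the limit as xi -> 0 is
    # 1/sqrt(F''(x_star)), with F'' = cosh here.
    w = abs(xi / dF(x)) if abs(xi) > 1e-12 else 1.0 / np.sqrt(np.cosh(x_star))
    return x, w

samples, weights = map(np.array, zip(*(sample_one() for _ in range(5000))))
print("implicit sampling mean:", np.sum(weights * samples) / np.sum(weights))

# Reference value by quadrature.
grid = np.linspace(-10.0, 10.0, 20001)
dens = np.exp(-F(grid))
print("quadrature mean:       ", np.trapz(grid * dens, grid) / np.trapz(dens, grid))
```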

    Intra-hour cloud index forecasting with data assimilation

    We introduce a computational framework to forecast cloud index (CI) fields for up to one hour on a spatial domain that covers a city. Such intra-hour CI forecasts are important for producing power forecasts for utility-scale solar plants and distributed rooftop solar. Our method combines a 2D advection model with cloud motion vectors (CMVs) derived from a mesoscale numerical weather prediction (NWP) model and from sparse optical flow acting on successive geostationary satellite images. We use ensemble data assimilation to combine these sources of cloud motion information based on the uncertainty of each data source. Our technique produces forecasts with root mean square errors similar to or lower than reference techniques that use only optical flow, NWP CMV fields, or persistence. We describe how the method operates on three representative case studies and present results from 39 cloudy days.
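    The framework described above combines an advection model, CMVs from an NWP model and from optical flow, and ensemble data assimilation; only the advection component lends itself to a compact sketch. The toy step below advects a cloud-index field with a single, spatially constant motion vector using a first-order upwind scheme. The grid, the CMV value, and the Gaussian cloud blob are invented inputs, whereas the paper's method blends spatially varying motion information through data assimilation.

```python
import numpy as np

def upwind_advect(ci, u, v, dx, dy, dt):
    """One first-order upwind step advecting a cloud-index field `ci`
    with a spatially constant cloud motion vector (u, v) [m/s].
    Boundaries are handled by simple edge padding."""
    padded = np.pad(ci, 1, mode="edge")
    center = padded[1:-1, 1:-1]
    # Upwind differences: difference "into the wind" along each axis
    # (axis 1 is x, axis 0 is y for the meshgrid below).
    ddx = (center - padded[1:-1, :-2]) / dx if u >= 0 else (padded[1:-1, 2:] - center) / dx
    ddy = (center - padded[:-2, 1:-1]) / dy if v >= 0 else (padded[2:, 1:-1] - center) / dy
    return ci - dt * (u * ddx + v * ddy)

# Hypothetical setup: a 100 km x 100 km domain at 1 km resolution, one
# Gaussian cloud blob, and a CMV of (10, 3) m/s advected for 15 minutes.
nx = ny = 100
x, y = np.meshgrid(np.arange(nx) * 1000.0, np.arange(ny) * 1000.0)
ci = np.exp(-((x - 3e4) ** 2 + (y - 5e4) ** 2) / (2 * 8e3 ** 2))

u, v, dt = 10.0, 3.0, 30.0     # m/s, m/s, s  (CFL number |u|*dt/dx = 0.3)
for _ in range(30):            # 30 steps of 30 s = a 15-minute forecast
    ci = upwind_advect(ci, u, v, 1000.0, 1000.0, dt)
print("CI maximum after 15 minutes:", ci.max())
```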

    MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure

    Markov chain Monte Carlo (MCMC) samplers are numerical methods for drawing samples from a given target probability distribution. We discuss one particular MCMC sampler, the MALA-within-Gibbs sampler, from theoretical and practical perspectives. We first show that the acceptance ratio and step size of this sampler are independent of the overall problem dimension when (i) the target distribution has sparse conditional structure, and (ii) this structure is reflected in the partial updating strategy of MALA-within-Gibbs. If, in addition, the target density is blockwise log-concave, then the sampler's convergence rate is independent of dimension. From a practical perspective, we expect MALA-within-Gibbs to be useful for solving high-dimensional Bayesian inference problems in which the posterior exhibits, at least approximately, sparse conditional structure. In this context, a partitioning of the state that correctly reflects the sparse conditional structure must be found, and we illustrate this process in two numerical examples. We also discuss trade-offs between the block size used for partial updating and the computational requirements, which may increase with the number of blocks.
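    As a concrete illustration of the partial-updating idea, the sketch below runs single-coordinate MALA-within-Gibbs sweeps on a toy Gaussian target whose tridiagonal precision matrix gives it sparse conditional structure: each coordinate's conditional depends only on its two neighbours. The dimension, precision matrix, and step size are illustrative choices rather than the paper's examples; the point is that the per-coordinate acceptance rate does not degrade as the dimension grows, which is the kind of dimension independence the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: zero-mean Gaussian with tridiagonal precision Q (diagonal
# `diag`, off-diagonal `off`), so pi(x_i | rest) depends only on x_{i-1}
# and x_{i+1}.  Dimension d and step size h are illustrative choices.
d = 200
diag, off = 2.5, -1.0

def neighbour_term(x, i):
    """b in log pi(x_i | rest) = -0.5*diag*x_i**2 - b*x_i + const."""
    left = x[i - 1] if i > 0 else 0.0
    right = x[i + 1] if i < d - 1 else 0.0
    return off * (left + right)

def mala_within_gibbs_sweep(x, h):
    """One sweep: a MALA update of each coordinate given all the others."""
    accepted = 0
    for i in range(d):
        b = neighbour_term(x, i)
        grad_x = -(diag * x[i] + b)                 # d/dx_i of log conditional
        y = x[i] + 0.5 * h * grad_x + np.sqrt(h) * rng.standard_normal()
        grad_y = -(diag * y + b)
        logpi_x = -0.5 * diag * x[i] ** 2 - b * x[i]
        logpi_y = -0.5 * diag * y ** 2 - b * y
        log_q_fwd = -(y - x[i] - 0.5 * h * grad_x) ** 2 / (2 * h)
        log_q_rev = -(x[i] - y - 0.5 * h * grad_y) ** 2 / (2 * h)
        if np.log(rng.uniform()) < logpi_y - logpi_x + log_q_rev - log_q_fwd:
            x[i] = y
            accepted += 1
    return accepted / d

x, h, acc, trace = np.zeros(d), 0.5, [], []
for _ in range(3000):
    acc.append(mala_within_gibbs_sweep(x, h))
    trace.append(x[d // 2])

Q = diag * np.eye(d) + off * (np.eye(d, k=1) + np.eye(d, k=-1))
print("mean acceptance rate:    ", np.mean(acc))
print("empirical var of x[d/2]: ", np.var(trace[500:]))
print("exact marginal variance: ", np.linalg.inv(Q)[d // 2, d // 2])
```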

    Small-noise analysis and symmetrization of implicit Monte Carlo samplers

    Implicit samplers are algorithms for producing independent, weighted samples from multivariate probability distributions. They are often applied in Bayesian data assimilation algorithms. We use Laplace asymptotic expansions to analyze two implicit samplers in the small-noise regime. Our analysis suggests a symmetrization of the algorithms that leads to improved (implicit) sampling schemes at a relatively small additional cost. Computational experiments confirm the theory and show that symmetrization is effective for small-noise sampling problems.
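    The abstract does not spell out the symmetrization itself. The sketch below records one standard antithetic-style symmetrization of a weighted sampler built from a map applied to a symmetric reference variable, which is the type of construction being analyzed; read it as an illustration of the idea rather than necessarily the paper's exact scheme.

```latex
% A weighted sampler defined by a map x = T(\xi), with \xi drawn from a
% symmetric reference density g (g(\xi) = g(-\xi)), carries the weight
\[
  w(\xi) \;=\; \frac{p\bigl(T(\xi)\bigr)\,\bigl|\det \partial_\xi T(\xi)\bigr|}{g(\xi)} .
\]
% The symmetrized sampler evaluates the map at the antithetic pair \xi, -\xi:
\[
  x^{+} = T(\xi), \qquad x^{-} = T(-\xi), \qquad w^{\pm} = w(\pm\xi),
\]
% keeps x^{+} with probability w^{+}/(w^{+}+w^{-}) (otherwise x^{-}),
% and assigns the retained sample the averaged weight
\[
  \bar{w} \;=\; \tfrac{1}{2}\bigl(w^{+} + w^{-}\bigr).
\]
% Averaging over the coin flip gives
% E[\bar{w}\, f(x)] = \tfrac12 E[w^{+} f(x^{+}) + w^{-} f(x^{-})]
%                   = E[w(\xi)\, f(T(\xi))],
% so the estimator remains consistent, while leading-order fluctuations of
% the weights can cancel within each pair in the small-noise regime.
```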

    Symmetrized importance samplers for stochastic differential equations

    We study a class of importance sampling methods for stochastic differential equations (SDEs). A small-noise analysis is performed, and the results suggest that a simple symmetrization procedure can significantly improve the performance of our importance sampling schemes when the noise is not too large. We demonstrate that this is indeed the case for a number of linear and nonlinear examples. Potential applications, e.g., data assimilation, are discussed.
    Comment: Added a brief discussion of the Hamilton-Jacobi equation and made various minor corrections. To appear in Communications in Applied Mathematics and Computational Science.
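    To see the variance-reduction mechanism numerically, the toy script below compares a plain importance sampler with its symmetrized version on a simple small-noise target (not an SDE), using a Gaussian proposal from the Laplace approximation. The target, noise level, and proposal are invented for illustration; the point is only that averaging the weights over each antithetic pair raises the effective sample size when the noise is small.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Toy small-noise target p_eps(x) ∝ exp(-F(x)/eps); F, eps, and the Gaussian
# (Laplace-approximation) proposal are illustrative choices.
eps = 0.05

def F(x):
    return np.cosh(x) + 0.5 * x

x_star = np.arcsinh(-0.5)                  # minimizer of F (F'(x) = sinh x + 0.5)
sigma = np.sqrt(eps / np.cosh(x_star))     # Laplace std dev = sqrt(eps / F''(x*))

def log_w(x):
    """Unnormalized log importance weight: log p_eps(x) - log q(x)."""
    return -F(x) / eps - norm.logpdf(x, loc=x_star, scale=sigma)

n = 20000
delta = sigma * rng.standard_normal(n)
x_plus, x_minus = x_star + delta, x_star - delta

lw_p, lw_m = log_w(x_plus), log_w(x_minus)
shift = max(lw_p.max(), lw_m.max())        # stabilize the exponentials
w_p, w_m = np.exp(lw_p - shift), np.exp(lw_m - shift)

# Plain importance sampling uses only the "+" samples and their weights.
# Symmetrized version: keep x_plus or x_minus with probability proportional
# to its weight and assign the pair the averaged weight (w+ + w-)/2.
pick_plus = rng.uniform(size=n) < w_p / (w_p + w_m)
x_sym = np.where(pick_plus, x_plus, x_minus)
w_sym = 0.5 * (w_p + w_m)

def ess(w):
    """Effective sample size implied by a set of importance weights."""
    return w.sum() ** 2 / (w ** 2).sum()

grid = np.linspace(x_star - 10 * sigma, x_star + 10 * sigma, 20001)
dens = np.exp(-(F(grid) - F(x_star)) / eps)
print(f"quadrature mean: {np.trapz(grid * dens, grid) / np.trapz(dens, grid):.4f}")
print(f"plain IS:        mean={np.sum(w_p * x_plus) / w_p.sum():.4f}  ESS={ess(w_p):.0f}")
print(f"symmetrized IS:  mean={np.sum(w_sym * x_sym) / w_sym.sum():.4f}  ESS={ess(w_sym):.0f}")
```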