
    A New Approach to Probabilistic Programming Inference

    We introduce and demonstrate a new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo. Our approach is simple to implement and easy to parallelize. It applies to Turing-complete probabilistic programming languages and supports accurate inference in models that make use of complex control flow, including stochastic recursion. It also includes primitives from Bayesian nonparametric statistics. Our experiments show that this approach can be more efficient than previously introduced single-site Metropolis-Hastings methods.
    Comment: Updated version of the 2014 AISTATS paper (to reflect changes in new language syntax). 10 pages, 3 figures. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings, Vol 33, 201
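The particle methods underlying this family of inference algorithms can be illustrated with a generic bootstrap particle filter. This is a minimal sketch of sequential importance resampling on a toy one-dimensional Gaussian random-walk model, not the paper's actual algorithm; the model, function name, and parameters are illustrative assumptions.

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500,
                              trans_sd=1.0, obs_sd=1.0, seed=0):
    """Bootstrap particle filter for a toy 1-D state-space model:
    x_t ~ N(x_{t-1}, trans_sd^2), y_t ~ N(x_t, obs_sd^2).
    Returns the filtered posterior mean at each time step.
    Illustrative sketch only, not the paper's PMCMC method."""
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    means = []
    for y in observations:
        # Propagate every particle through the transition model.
        particles = [x + rng.gauss(0.0, trans_sd) for x in particles]
        # Weight by the (unnormalised) Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2)
                   for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Filtered mean estimate under the current weighted particles.
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling: duplicate high-weight particles.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means
```

Particle MCMC methods embed a sweep like this inside an MCMC kernel, which is what makes them applicable to programs with stochastic control flow: each particle is an independent execution of the program.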

    Core Precession and Global Modes in Granular Bulk Flow

    A transition from local to global shear zones is reported for granular flows in a modified Couette cell. The experimental geometry is a slowly rotating drum which has a stationary disc of radius R_s fixed at its bottom. Granular material, which fills this cell up to height H, forms a wide shear zone which emanates from the discontinuity at the stationary disc's edge. For shallow layers (H/R_s < 0.55), the shear zone reaches the free surface, with the core of the material resting on the disc and remaining stationary. In contrast, for deep layers (H/R_s > 0.55), the shear zones meet below the surface and the core starts to precess. A change in the symmetry of the surface velocities reveals that this behavior is associated with a transition from a local to a global shear mode.
    Comment: 4 pages, 7 figures, submitted
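The reported transition reduces to a single dimensionless criterion on the filling-height-to-disc-radius ratio. A minimal sketch, assuming only the threshold stated in the abstract (the function name and return labels are illustrative):

```python
def shear_mode(H, R_s, critical_ratio=0.55):
    """Classify the granular shear regime from filling height H and
    stationary-disc radius R_s. Below the reported critical ratio the
    shear zone reaches the free surface and the core stays stationary
    ("local" mode); above it the shear zones close below the surface
    and the core precesses ("global" mode)."""
    return "local" if H / R_s < critical_ratio else "global"
```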

    Bayesian Optimization for Probabilistic Programs

    We present the first general-purpose framework for marginal maximum a posteriori estimation of probabilistic program variables. By using a series of code transformations, the evidence of any probabilistic program, and therefore of any graphical model, can be optimized with respect to an arbitrary subset of its sampled variables. To carry out this optimization, we develop the first Bayesian optimization package to directly exploit the source code of its target, leading to innovations in problem-independent hyperpriors, unbounded optimization, and implicit constraint satisfaction, and delivering significant performance improvements over prominent existing packages. We present applications of our method to a number of tasks, including engineering design and parameter optimization.
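The core loop of Bayesian optimization, which this framework builds on, fits a probabilistic surrogate to past evaluations and picks the next query point by maximizing an acquisition function. This is a minimal generic sketch with a Gaussian-process surrogate and an upper-confidence-bound acquisition over a 1-D grid; it is not the paper's package, and the kernel length-scale, UCB coefficient, and grid size are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, length_scale=0.3):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length_scale) ** 2)

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10,
              noise=1e-6, seed=0):
    """Minimal Bayesian optimization (maximization) of a 1-D function
    using a zero-mean GP surrogate and a UCB acquisition evaluated on a
    dense candidate grid. Returns the best (x, f(x)) pair observed."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, n_init)          # initial random design
    y = np.array([f(x) for x in X])
    grid = np.linspace(lo, hi, 201)
    for _ in range(n_iter):
        K = rbf(X, X) + noise * np.eye(len(X))
        Ks = rbf(grid, X)
        # GP posterior mean at each grid point.
        mu = Ks @ np.linalg.solve(K, y)
        # GP posterior variance (prior variance k(x, x) = 1).
        v = np.linalg.solve(K, Ks.T)
        var = np.clip(1.0 - np.sum(Ks.T * v, axis=0), 0.0, None)
        # UCB acquisition: exploit high mean, explore high variance.
        ucb = mu + 2.0 * np.sqrt(var)
        x_next = grid[np.argmax(ucb)]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    best = int(np.argmax(y))
    return X[best], y[best]
```

What distinguishes the framework described above from a generic loop like this is that it derives surrogate structure and constraints from the probabilistic program's source code rather than treating the target as a black box.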