    On an adaptive preconditioned Crank-Nicolson MCMC algorithm for infinite dimensional Bayesian inferences

    Many scientific and engineering problems require performing Bayesian inference for unknowns of infinite dimension. In such problems, many standard Markov Chain Monte Carlo (MCMC) algorithms become arbitrarily slow under mesh refinement, a property referred to as being dimension dependent. To address this, a family of dimension-independent MCMC algorithms, known as the preconditioned Crank-Nicolson (pCN) methods, was proposed to sample the infinite dimensional parameters. In this work we develop an adaptive version of the pCN algorithm, in which the covariance operator of the proposal distribution is adjusted based on the sampling history to improve simulation efficiency. We show that the proposed algorithm satisfies an important ergodicity condition under some mild assumptions. Finally, we provide numerical examples to demonstrate the performance of the proposed method.
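For context, the basic (non-adaptive) pCN proposal the abstract builds on can be sketched as follows. This is a minimal illustration in whitened coordinates (a discretised prior N(0, I)); the names `pcn_step`, `beta`, and the toy likelihood are illustrative, and the adaptive covariance tuning the paper introduces is deliberately omitted.

```python
import numpy as np

def pcn_step(u, log_likelihood, beta, rng):
    """One pCN step targeting posterior ∝ exp(log_likelihood(u)) * N(u; 0, I).

    The proposal v = sqrt(1 - beta^2) u + beta xi, xi ~ N(0, I), leaves the
    Gaussian prior invariant, so the acceptance ratio involves only the
    likelihood; this is what makes the scheme dimension independent.
    """
    xi = rng.standard_normal(u.shape)              # draw from the (discretised) prior
    v = np.sqrt(1.0 - beta**2) * u + beta * xi     # pCN proposal
    log_alpha = log_likelihood(v) - log_likelihood(u)
    if np.log(rng.uniform()) < log_alpha:
        return v, True
    return u, False

# usage: sample from the posterior of x ~ N(0, I) given a noisy observation
rng = np.random.default_rng(0)
y = np.full(50, 1.0)                               # toy data
loglik = lambda u: -0.5 * np.sum((y - u) ** 2)
u = np.zeros(50)
accepts = 0
for _ in range(2000):
    u, acc = pcn_step(u, loglik, beta=0.2, rng=rng)
    accepts += acc
```

The acceptance rate stays stable as the dimension (here 50) grows, which is the dimension-independence property the abstract refers to.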

    Representation Theorems for Quadratic $\mathcal{F}$-Consistent Nonlinear Expectations

    In this paper we extend the notion of "filtration-consistent nonlinear expectation" (or "$\mathcal{F}$-consistent nonlinear expectation") to the case when it is allowed to be dominated by a $g$-expectation that may have quadratic growth. We show that for such a nonlinear expectation many fundamental properties of a martingale still make sense, including the Doob-Meyer type decomposition theorem and the optional sampling theorem. More importantly, we show that any quadratic $\mathcal{F}$-consistent nonlinear expectation with a certain domination property must be a quadratic $g$-expectation. The main contribution of this paper is the identification of a domination condition to replace the one used in all previous works, which is no longer valid in the quadratic case. We also show that the representation generator must be deterministic, continuous, and in fact of a simple form.
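As background for the terms above (a standard definition, not taken from the paper): a $g$-expectation is built from the solution of a backward stochastic differential equation (BSDE) with generator $g$; the "quadratic" case refers to $g$ having quadratic growth in $z$.

```latex
% Standard definition of a g-expectation via a BSDE (background only):
% given a terminal value \xi and a generator g, let (Y, Z) solve
Y_t = \xi + \int_t^T g(s, Y_s, Z_s)\,ds - \int_t^T Z_s\,dB_s,
\qquad 0 \le t \le T,
% and define the (nonlinear) g-expectation of \xi as
\mathcal{E}_g[\xi] := Y_0 .
```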

    A hybrid adaptive MCMC algorithm in function spaces

    The preconditioned Crank-Nicolson (pCN) method is a Markov Chain Monte Carlo (MCMC) scheme specifically designed to perform Bayesian inference in function spaces. Unlike many standard MCMC algorithms, the pCN method preserves its sampling efficiency under mesh refinement, a property referred to as being dimension independent. In this work we consider an adaptive strategy to further improve the efficiency of pCN. In particular, we develop a hybrid adaptive MCMC method: the algorithm performs an adaptive Metropolis scheme in a chosen finite dimensional subspace, and a standard pCN algorithm in the complement of the chosen subspace. We show that the proposed algorithm satisfies certain important ergodicity conditions. Finally, with numerical examples we demonstrate that the proposed method has competitive performance with existing adaptive algorithms.
    Comment: arXiv admin note: text overlap with arXiv:1511.0583
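The subspace split described in the abstract can be sketched as a single transition step. This is a minimal illustration under stated assumptions (whitened prior N(0, I), the first `k` coordinates taken as the chosen subspace); the names `hybrid_step`, `step`, and `cov_chol` are illustrative, and the adaptation of `cov_chol` from the chain history is only indicated, not implemented.

```python
import numpy as np

def hybrid_step(u, log_like, k, beta, step, cov_chol, rng):
    """One hybrid step: random-walk Metropolis proposal on the first k
    coordinates, pCN proposal on the remaining coordinates."""
    v = u.copy()
    # adaptive-Metropolis part: in the full scheme cov_chol would be
    # re-estimated periodically from the sampling history
    v[:k] = u[:k] + step * cov_chol @ rng.standard_normal(k)
    # pCN part on the complement: leaves the Gaussian prior invariant there
    xi = rng.standard_normal(u.size - k)
    v[k:] = np.sqrt(1.0 - beta**2) * u[k:] + beta * xi
    # acceptance: likelihood ratio, plus the prior ratio on the random-walk
    # block only (the pCN block's prior terms cancel by construction)
    log_alpha = (log_like(v) - log_like(u)
                 - 0.5 * (v[:k] @ v[:k] - u[:k] @ u[:k]))
    return (v, True) if np.log(rng.uniform()) < log_alpha else (u, False)

# usage on a toy 30-dimensional Gaussian target
rng = np.random.default_rng(1)
loglik = lambda x: -0.5 * np.sum((x - 1.0) ** 2)
u, k = np.zeros(30), 5
for _ in range(1000):
    u, _ = hybrid_step(u, loglik, k, beta=0.2, step=0.5,
                       cov_chol=np.eye(k), rng=rng)
```

The design point is that only the finite dimensional block is adapted, so the dimension-independence of the pCN part on the (infinite dimensional) complement is retained.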