
    Stochastic domination for the Ising and fuzzy Potts models

    We discuss various aspects concerning stochastic domination for the Ising model and the fuzzy Potts model. We begin by considering the Ising model on the homogeneous tree of degree $d$, $\mathbb{T}^d$. For given interaction parameters $J_1, J_2 > 0$ and external field $h_1 \in \mathbb{R}$, we compute the smallest external field $\tilde{h}$ such that the plus measure with parameters $J_2$ and $h$ dominates the plus measure with parameters $J_1$ and $h_1$ for all $h \geq \tilde{h}$. Moreover, we discuss continuity of $\tilde{h}$ with respect to the three parameters $J_1$, $J_2$, $h_1$, and also how the plus measures are stochastically ordered in the interaction parameter for a fixed external field. Next, we consider the fuzzy Potts model and prove that on $\mathbb{Z}^d$ the fuzzy Potts measures dominate the same set of product measures, while on $\mathbb{T}^d$, for certain parameter values, the free and minus fuzzy Potts measures dominate different product measures. For the Ising model, Liggett and Steif proved that on $\mathbb{Z}^d$ the plus measures dominate the same set of product measures, while on $\mathbb{T}^2$ that statement fails completely except when there is a unique phase.
    Comment: 22 pages, 5 figures
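    For orientation, the threshold in question can be restated using the standard definition of stochastic domination; the notation $\mu^{+}_{J,h}$ for the plus measure with coupling $J$ and external field $h$ is our shorthand here, not necessarily the paper's:

    \[
      \mu \preceq \nu \quad\Longleftrightarrow\quad \int f \, d\mu \le \int f \, d\nu \ \text{ for every increasing function } f,
    \]
    \[
      \tilde{h} \;=\; \min\bigl\{\, h^{*} \in \mathbb{R} \;:\; \mu^{+}_{J_2,\,h} \succeq \mu^{+}_{J_1,\,h_1} \ \text{ for all } h \ge h^{*} \,\bigr\}.
    \]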

    Smoothing and filtering with a class of outer measures

    Filtering and smoothing with a generalised representation of uncertainty are considered. Here, uncertainty is represented using a class of outer measures. It is shown how this representation of uncertainty can be propagated using outer-measure-type versions of Markov kernels and generalised Bayesian-like update equations. This leads to a system of generalised smoothing and filtering equations in which integrals are replaced by supremums and probability density functions are replaced by positive functions with supremum equal to one. Interestingly, these equations retain most of the structure found in the classical Bayesian filtering framework. It is additionally shown that the Kalman filter recursion can be recovered from weaker assumptions about the available information on the corresponding hidden Markov model.
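    To make the recursion concrete, here is a minimal sketch of a sup-based filter on a finite state space, consistent with the description above (supremums in place of integrals, positive functions with supremum one in place of densities); the function names, array layout, and toy numbers are our own illustrative assumptions, not code from the paper.

    # Hypothetical sketch of a sup-based (max-product) filtering recursion
    # on a finite state space; illustrative only, not the authors' code.
    import numpy as np

    def predict(pi_prev, transition):
        """Prediction step: the integral over the previous state is replaced
        by a supremum (here a max over a finite state space).

        pi_prev:    (n,) possibility function over states, max equal to 1.
        transition: (n, n) array, transition[i, j] = possibility of moving
                    from state j to state i, each column having max 1.
        """
        return np.max(transition * pi_prev[None, :], axis=1)

    def update(pi_pred, likelihood):
        """Update step: renormalise by the supremum so the result again has
        supremum equal to one, instead of dividing by an integral."""
        unnorm = likelihood * pi_pred
        return unnorm / np.max(unnorm)

    # Tiny usage example on a 3-state chain (all numbers illustrative).
    pi0 = np.array([1.0, 0.5, 0.2])          # initial possibility, max = 1
    T = np.array([[1.0, 0.3, 0.1],
                  [0.4, 1.0, 0.3],
                  [0.1, 0.4, 1.0]])           # column-wise max = 1
    lik = np.array([0.2, 1.0, 0.6])          # observation likelihood-like term

    pi1 = update(predict(pi0, T), lik)
    print(pi1)  # positive, with max exactly 1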