
    A New Look at Survey Propagation and Its Generalizations

    This article provides a new conceptual perspective on survey propagation, an iterative algorithm recently introduced by the statistical physics community that is very effective at solving random k-SAT problems even at densities close to the satisfiability threshold. We first describe how any SAT formula can be associated with a novel family of Markov random fields (MRFs), parameterized by a real number ρ ∈ [0, 1]. We then show that applying belief propagation (a well-known “message-passing” technique for estimating marginal probabilities) to this family of MRFs recovers a known family of algorithms, ranging from pure survey propagation at one extreme (ρ = 1) to standard belief propagation on the uniform distribution over SAT assignments at the other extreme (ρ = 0). Configurations in these MRFs have a natural interpretation as partial satisfiability assignments, on which a partial order can be defined. We isolate cores as minimal elements in this partial ordering; cores are also fixed points of survey propagation and the only assignments with positive probability in the MRF for ρ = 1. Our experimental results for k = 3 suggest that solutions of random formulas typically do not possess non-trivial cores. This makes it necessary to study the structure of the space of partial assignments for ρ < 1 and to investigate the role of assignments that are very close to being cores. To that end, we investigate the associated lattice structure and prove a weight-preserving identity that shows how any MRF with ρ > 0 can be viewed as a “smoothed” version of the uniform distribution over satisfying assignments (ρ = 0). Finally, we isolate properties of Gibbs sampling and message-passing algorithms that are typical for an ensemble of k-SAT problems.
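
    The abstract describes a one-parameter family of message-passing updates. The sketch below is a minimal, illustrative rendering of such a ρ-interpolated update on a k-SAT factor graph: at ρ = 1 it reduces to the standard survey propagation update, and at ρ = 0 to belief propagation on the uniform distribution over satisfying assignments. The exact placement of ρ, the function name sp_rho_update, and all defaults are assumptions made for illustration, not the paper's own pseudocode.

```python
import random
from collections import defaultdict

def sp_rho_update(clauses, rho, iters=200, damping=0.0, tol=1e-6, seed=0):
    # clauses: list of clauses; each clause is a list of nonzero ints,
    # where literal v means "variable v is True" satisfies the clause and -v means False.
    # Clauses are assumed non-tautological.
    # Returns eta[(a, i)]: survey-style message from clause a to variable i.
    rng = random.Random(seed)
    eta = {(a, abs(l)): rng.random() for a, cl in enumerate(clauses) for l in cl}
    occ = defaultdict(list)            # occ[i] = [(clause index, sign of i in it), ...]
    for a, cl in enumerate(clauses):
        for l in cl:
            occ[abs(l)].append((a, 1 if l > 0 else -1))

    for _ in range(iters):
        max_diff = 0.0
        for a, cl in enumerate(clauses):
            for l in cl:
                i = abs(l)
                prod = 1.0
                for l2 in cl:
                    j = abs(l2)
                    if j == i:
                        continue
                    sign_in_a = 1 if l2 > 0 else -1
                    p_same = p_opp = 1.0   # products of (1 - eta) over the other clauses on j
                    for b, sign_in_b in occ[j]:
                        if b == a:
                            continue
                        if sign_in_b == sign_in_a:
                            p_same *= 1.0 - eta[(b, j)]
                        else:
                            p_opp *= 1.0 - eta[(b, j)]
                    # rho interpolates between survey propagation (rho = 1)
                    # and belief propagation on satisfying assignments (rho = 0).
                    pi_u = (1.0 - rho * p_opp) * p_same   # j pushed to violate clause a
                    pi_s = (1.0 - rho * p_same) * p_opp   # j pushed to satisfy clause a
                    pi_star = rho * p_same * p_opp        # j unconstrained ("star")
                    z = pi_u + pi_s + pi_star
                    prod *= pi_u / z if z > 0.0 else 0.0
                new = (1.0 - damping) * prod + damping * eta[(a, i)]
                max_diff = max(max_diff, abs(new - eta[(a, i)]))
                eta[(a, i)] = new
        if max_diff < tol:
            break
    return eta
```

    In practice such messages are used inside a decimation loop that repeatedly fixes the most strongly biased variable and simplifies the formula; that outer loop is omitted here.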

    Counting solutions from finite samplings

    We formulate the solution counting problem within the framework of the inverse Ising problem and use fast belief propagation equations to estimate the entropy, whose value provides an estimate of the true one. We test this idea on both diluted models (random 2-SAT and 3-SAT problems) and a fully-connected model (the binary perceptron), and show that when the constraint density is small, this estimate can be very close to the true value. The information stored by the salamander retina under natural movie stimuli can also be estimated, and our result is consistent with that obtained by the Monte Carlo method. Of particular significance, the sizes of other metastable states of this real neuronal network are also predicted.
    Comment: 9 pages, 4 figures and 1 table, further discussions added
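
    As one concrete way to read the BP-based counting idea, the sketch below estimates log(number of satisfying assignments) of a small CNF formula from the Bethe free entropy evaluated at a belief-propagation fixed point, and compares it with brute-force enumeration on a toy instance. This is a generic Bethe counting estimate, not the paper's inverse-Ising formulation or its perceptron and retina analyses; the function name, the toy formula, and the iteration defaults are assumptions for illustration.

```python
import itertools
import math
from collections import defaultdict

def bethe_log_count(clauses, n_vars, iters=200, tol=1e-9):
    """Estimate log(#satisfying assignments) from the Bethe free entropy
    evaluated at a belief-propagation fixed point. Clauses are lists of
    nonzero ints: literal v means variable v True satisfies the clause."""
    occ = defaultdict(list)            # occ[i] = [(clause index, i positive in it?), ...]
    for a, cl in enumerate(clauses):
        for l in cl:
            occ[abs(l)].append((a, l > 0))
    # eta[(a, i)]: probability that no variable other than i satisfies clause a
    eta = {(a, abs(l)): 0.5 for a, cl in enumerate(clauses) for l in cl}

    def cavity(j, a, positive_in_a):
        # Probability that x_j takes the value violating clause a,
        # given the messages from all clauses containing j except a.
        p_same = p_opp = 1.0
        for b, pos in occ[j]:
            if b == a:
                continue
            if pos == positive_in_a:
                p_same *= 1.0 - eta[(b, j)]
            else:
                p_opp *= 1.0 - eta[(b, j)]
        return p_same / (p_same + p_opp) if p_same + p_opp > 0.0 else 0.5

    for _ in range(iters):             # plain belief propagation on satisfying assignments
        diff = 0.0
        for a, cl in enumerate(clauses):
            for l in cl:
                i = abs(l)
                new = 1.0
                for l2 in cl:
                    if abs(l2) != i:
                        new *= cavity(abs(l2), a, l2 > 0)
                diff = max(diff, abs(new - eta[(a, i)]))
                eta[(a, i)] = new
        if diff < tol:
            break

    # Bethe free entropy: sum_a log Z_a + sum_i log Z_i - sum_(a,i) log Z_ai
    log_z = 0.0
    for a, cl in enumerate(clauses):
        all_violate = 1.0
        for l in cl:
            mu_u = cavity(abs(l), a, l > 0)
            all_violate *= mu_u
            # edge term Z_ai = mu(satisfy) * 1 + mu(violate) * (1 - eta)
            log_z -= math.log((1.0 - mu_u) + mu_u * (1.0 - eta[(a, abs(l))]))
        log_z += math.log(1.0 - all_violate)          # clause term Z_a
    for i in range(1, n_vars + 1):
        z_true = z_false = 1.0
        for a, pos in occ[i]:
            if pos:
                z_false *= 1.0 - eta[(a, i)]          # x_i = False does not satisfy a
            else:
                z_true *= 1.0 - eta[(a, i)]
        log_z += math.log(z_true + z_false)           # variable term Z_i
    return log_z

if __name__ == "__main__":
    # Toy instance (illustrative), small enough to enumerate exactly.
    clauses = [[1, 2, -3], [-1, 3, 4], [2, -4, 5], [-2, -5, 1]]
    n = 5
    exact = sum(
        all(any((x[abs(l) - 1] == 1) == (l > 0) for l in cl) for cl in clauses)
        for x in itertools.product([0, 1], repeat=n)
    )
    print("Bethe estimate of log #solutions:", bethe_log_count(clauses, n))
    print("Exact log #solutions:            ", math.log(exact))
```

    On tree-structured formulas this estimate is exact; at low constraint density on random instances it stays close to the true value, which is the regime the abstract reports.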