
    Loop-corrected belief propagation for lattice spin models

    Belief propagation (BP) is a message-passing method for solving probabilistic graphical models. It is very successful in treating disordered models (such as spin glasses) on random graphs. Finite-dimensional lattice models, on the other hand, have an abundance of short loops, and the BP method remains far from satisfactory in treating the complicated loop-induced correlations in these systems. Here we propose a loop-corrected BP method that takes into account the effect of short loops in lattice spin models. We demonstrate, through an application to the square-lattice Ising model, that loop-corrected BP improves significantly on the naive BP method. We also implement loop-corrected BP at the coarse-grained region-graph level to further boost its performance.
    Comment: 11 pages, minor changes with new references added. Final version as published in EPJ
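
    For reference, a minimal sketch of the baseline (naive) BP / Bethe cavity iteration for the homogeneous square-lattice Ising model, i.e. the method the loop corrections improve on; the loop-corrected scheme itself is not reproduced here. It also prints the Bethe critical coupling next to Onsager's exact value, which is the kind of gap short loops open up.

```python
# Baseline BP (Bethe) cavity iteration for the homogeneous square-lattice Ising model.
# This is NOT the loop-corrected scheme of the paper above; it is the naive BP it improves on.
import numpy as np

def bp_cavity_field(beta, J=1.0, k=4, tol=1e-12, max_iter=10000):
    """Iterate the Bethe cavity equation u = (1/beta) * artanh(tanh(beta*J) * tanh(beta*(k-1)*u))."""
    u = 0.1  # small symmetry-breaking seed
    for _ in range(max_iter):
        u_new = np.arctanh(np.tanh(beta * J) * np.tanh(beta * (k - 1) * u)) / beta
        if abs(u_new - u) < tol:
            break
        u = u_new
    return u

beta = 0.5
u = bp_cavity_field(beta)
m = np.tanh(beta * 4 * u)                     # magnetization from the k = 4 incoming messages
print(f"BP magnetization at beta={beta}: {m:.4f}")

beta_c_bethe = np.arctanh(1.0 / 3.0)          # Bethe prediction: tanh(beta_c J) = 1/(k-1)
beta_c_exact = np.log(1.0 + np.sqrt(2)) / 2   # Onsager's exact square-lattice result
print(f"Bethe beta_c = {beta_c_bethe:.4f}  vs  exact beta_c = {beta_c_exact:.4f}")
```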

    Loop corrections in spin models through density consistency

    Computing marginal distributions of discrete or semidiscrete Markov random fields (MRFs) is a fundamental, generally intractable problem with a vast number of applications in virtually all fields of science. We present a new family of computational schemes to approximately calculate the marginals of discrete MRFs. This method shares some desirable properties with belief propagation, in particular providing exact marginals on acyclic graphs, but it differs from the latter in that it includes some loop corrections; i.e., it takes into account correlations coming from all cycles in the factor graph. It is also similar to the adaptive Thouless-Anderson-Palmer method, but it differs from the latter in that the consistency is not on the first two moments of the distribution but rather on the value of its density on a subset of values. The results on finite-dimensional Ising-like models show a significant improvement with respect to the Bethe-Peierls (tree) approximation in all cases, and with respect to the plaquette cluster variational method approximation in many cases. In particular, for the critical inverse temperature $\beta_c$ of the homogeneous hypercubic lattice, the expansion of $(d\beta_c)^{-1}$ around $d=\infty$ of the proposed scheme is exact up to order $d^{-4}$, whereas the two latter are exact only up to order $d^{-2}$.
    Comment: 12 pages, 3 figures, 1 table
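
    For orientation only, and not the density-consistency scheme of the paper: the Bethe (tree) estimate on the 2d-regular hypercubic lattice is beta_c = artanh(1/(2d-1)), and the sketch below expands $(d\beta_c)^{-1}$ around $d=\infty$ symbolically. The exact lattice series and the plaquette-CVM expansion that the abstract compares against are not reproduced here.

```python
# Large-d expansion of (d*beta_c)^(-1) for the Bethe (tree) approximation only.
import sympy as sp

d = sp.symbols('d', positive=True)
beta_c_bethe = sp.atanh(1 / (2 * d - 1))          # Bethe estimate: tanh(beta_c) = 1/(2d-1)
expansion = sp.series(1 / (d * beta_c_bethe), d, sp.oo, 4)
print(expansion)                                  # leading behaviour: 2 - 1/d - ...
```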

    Truncating the loop series expansion for Belief Propagation

    Recently, M. Chertkov and V.Y. Chernyak derived an exact expression for the partition sum (normalization constant) of a graphical model as an expansion around the Belief Propagation solution. By adding correction terms to the BP free energy, one for each "generalized loop" in the factor graph, the exact partition sum is obtained. However, the number of generalized loops is usually enormous, which prohibits summation over all correction terms. In this article we introduce Truncated Loop Series BP (TLSBP), a particular way of truncating the loop series of Chertkov and Chernyak by considering generalized loops as compositions of simple loops. We analyze the performance of TLSBP in different scenarios, including the Ising model, regular random graphs, and Promedas, a large probabilistic medical diagnostic system. We show that TLSBP often improves upon the accuracy of the BP solution, at the expense of increased computation time. We also show that the performance of TLSBP strongly depends on the degree of interaction between the variables: for weak interactions, truncating the series leads to significant improvements, whereas for strong interactions it can be ineffective, even when a large number of terms is considered.
    Comment: 31 pages, 12 figures, submitted to Journal of Machine Learning Research
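
    For orientation, a minimal worked instance of the loop series referred to above: on a homogeneous Ising ring (a single cycle, zero field) the series contains exactly one generalized-loop term, so Z_exact = Z_BP * (1 + tanh(beta*J)^N). The sketch below checks this identity against brute-force enumeration; it only illustrates the expansion and is not the TLSBP truncation algorithm.

```python
# Loop series on the simplest nontrivial case: a homogeneous Ising ring at zero field.
import itertools
import numpy as np

N, beta, J = 8, 0.4, 1.0

# Exact partition function by brute-force enumeration of the 2^N spin configurations.
Z_exact = 0.0
for spins in itertools.product([-1, 1], repeat=N):
    energy = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
    Z_exact += np.exp(-beta * energy)

Z_bp = (2 * np.cosh(beta * J)) ** N           # Bethe/BP partition sum (paramagnetic fixed point)
loop_term = np.tanh(beta * J) ** N            # the single generalized-loop correction of the ring

print(f"Z_exact           = {Z_exact:.6f}")
print(f"Z_BP              = {Z_bp:.6f}")
print(f"Z_BP * (1 + loop) = {Z_bp * (1 + loop_term):.6f}")
```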

    Cycle-based Cluster Variational Method for Direct and Inverse Inference

    We elaborate on the idea that loop corrections to belief propagation can be dealt with in a systematic way on pairwise Markov random fields by using the elements of a cycle basis to define regions in a generalized belief propagation setting. The region graph is specified in such a way as to avoid dual loops as much as possible, by discarding redundant Lagrange multipliers, in order to facilitate convergence while avoiding the instabilities associated with a minimal factor-graph construction. We end up with a two-level algorithm, where a belief propagation algorithm is run alternately at the level of each cycle and at the inter-region level. The inverse problem of finding the couplings of a Markov random field from empirical covariances can be addressed region by region. It turns out that this can be done efficiently, in particular in the Ising context, where fixed-point equations can be derived along with a one-parameter log-likelihood function to minimize. Numerical experiments confirm the effectiveness of these considerations for both direct and inverse MRF inference.
    Comment: 47 pages, 16 figures
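
    As a small illustration of the first ingredient above, the sketch below uses networkx to extract a cycle basis of a 3x3 grid graph; its elements are the kind of cycles that would serve as regions in a generalized belief propagation construction. The two-level message passing and the inverse-Ising procedure themselves are not reproduced here.

```python
# Cycle basis of a small lattice graph; each basis cycle is a candidate GBP region.
import networkx as nx

G = nx.grid_2d_graph(3, 3)                  # 3x3 lattice: 9 nodes, 12 edges
basis = nx.cycle_basis(G)                   # |E| - |V| + 1 = 4 independent cycles
print(f"{len(basis)} cycles in the basis:")
for cycle in basis:
    print("  region:", cycle)
```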

    Gauge-free cluster variational method by maximal messages and moment matching

    We present an implementation of the cluster variational method (CVM) as a message-passing algorithm. The kind of message-passing algorithm used for CVM, usually named generalized belief propagation (GBP), generalizes the belief propagation algorithm in the same way that CVM generalizes the Bethe approximation for estimating the partition function. However, the connection between fixed points of GBP and the extremal points of the CVM free energy is usually not one-to-one, because of the existence of a gauge transformation involving the GBP messages. Our contribution is twofold. First, we propose a way of defining messages (fields) in a generic CVM approximation such that messages arrive at a given region from all its ancestors, and not only from its direct parents as in the standard parent-to-child GBP. We call this approach maximal messages. Second, we focus on the case of binary variables, reinterpreting the messages as fields enforcing consistency between the moments of the local (marginal) probability distributions. We provide a precise rule to enforce all consistencies while avoiding any redundancy that would otherwise lead to a gauge transformation on the messages. This moment-matching method is gauge free, i.e., it guarantees that the resulting GBP has no gauge invariance. We apply our maximal-messages, moment-matching GBP to obtain an analytical expression for the critical temperature of the Ising model in general dimensions at the level of plaquette CVM. The values obtained outperform Bethe estimates and are comparable with those from loop-corrected belief propagation equations. The method allows for a straightforward generalization to disordered systems.
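
    As background to the moment-matching reinterpretation, the sketch below illustrates the standard fact it relies on: for binary +/-1 variables, a pairwise marginal is fully parametrized by its first and second moments, so enforcing consistency on moments is equivalent to enforcing it on the marginals themselves. It is not the maximal-message GBP construction of the paper.

```python
# Moment parametrization of a pairwise marginal over binary (+/-1) variables.
import itertools

def pair_marginal(m_i, m_j, c_ij):
    """b(s_i, s_j) = (1 + m_i*s_i + m_j*s_j + c_ij*s_i*s_j) / 4 for s = +/-1."""
    return {(si, sj): (1 + m_i * si + m_j * sj + c_ij * si * sj) / 4
            for si, sj in itertools.product([-1, 1], repeat=2)}

b = pair_marginal(0.2, -0.1, 0.3)
# Recover the moments from the table: they reproduce the inputs by construction.
m_i = sum(si * p for (si, sj), p in b.items())
m_j = sum(sj * p for (si, sj), p in b.items())
c_ij = sum(si * sj * p for (si, sj), p in b.items())
print(m_i, m_j, c_ij, sum(b.values()))   # 0.2 -0.1 0.3 1.0
```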

    Approximate inference on graphical models: message-passing, loop-corrected methods and applications

    The abstract is in the attachment.