
    Cluster Variation Method in Statistical Physics and Probabilistic Graphical Models

    The cluster variation method (CVM) is a hierarchy of approximate variational techniques for discrete (Ising-like) models in equilibrium statistical mechanics, improving on the mean-field approximation and the Bethe-Peierls approximation, which can be regarded as the lowest level of the CVM. In recent years it has been applied both in statistical physics and to inference and optimization problems formulated in terms of probabilistic graphical models. The foundations of the CVM are briefly reviewed, and the relations with similar techniques are discussed. The main properties of the method are considered, with emphasis on its exactness for particular models and on its asymptotic properties. The problem of the minimization of the variational free energy, which arises in the CVM, is also addressed, and recent results about both provably convergent and message-passing algorithms are discussed. (36 pages, 17 figures)
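
    As a point of reference for the hierarchy described above, the sketch below implements only its lowest rung, the naive mean-field fixed-point equations for a small Ising model; the couplings, fields, and graph are made-up illustrative values, not taken from the paper, and none of the CVM's higher-order cluster corrections appear here.

```python
# Illustrative sketch only: the naive mean-field self-consistency equations
# m_i = tanh(beta * (h_i + sum_j J_ij m_j)), i.e. the lowest level of the
# variational hierarchy that the CVM refines. J and h are arbitrary example
# values, not taken from the paper.
import numpy as np

def mean_field_magnetizations(J, h, beta=1.0, n_iter=200):
    """Iterate the mean-field fixed-point equations for an Ising model."""
    m = np.zeros(len(h))
    for _ in range(n_iter):
        m = np.tanh(beta * (h + J @ m))
    return m

# A 4-spin ferromagnetic chain with a small external field.
J = np.zeros((4, 4))
for i in range(3):
    J[i, i + 1] = J[i + 1, i] = 1.0
h = np.full(4, 0.1)
print(mean_field_magnetizations(J, h, beta=0.5))
```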

    Approximating Subdense Instances of Covering Problems

    We study the approximability of subdense instances of various covering problems on graphs, defined as instances in which the minimum or average degree is Omega(n/psi(n)) for some function psi(n) = omega(1) of the instance size. We design new approximation algorithms as well as new polynomial-time approximation schemes (PTASs) for these problems and establish the first approximation hardness results for them. Interestingly, in some cases we were able to prove optimality of the underlying approximation ratios under the usual complexity-theoretic assumptions. Our results for the Vertex Cover problem depend on an improved recursive sampling method, which could be of independent interest.
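
    The paper's recursive sampling method is not reproduced here; as a rough baseline, the sketch below shows the textbook maximal-matching 2-approximation for Vertex Cover on a toy edge list, the kind of constant ratio that the subdense-instance schemes above improve upon for dense graphs.

```python
# Not the paper's algorithm: the standard maximal-matching 2-approximation
# for Vertex Cover, shown only as a baseline. The edge list is a toy example.
def vertex_cover_2approx(edges):
    """Return a vertex cover of size at most twice the optimum."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))   # take both endpoints of an uncovered edge
    return cover

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
print(vertex_cover_2approx(edges))
```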

    Cycle-based Cluster Variational Method for Direct and Inverse Inference

    We elaborate on the idea that loop corrections to belief propagation on pairwise Markov random fields can be dealt with in a systematic way by using the elements of a cycle basis to define regions in a generalized belief propagation setting. The region graph is specified so as to avoid dual loops as much as possible, by discarding redundant Lagrange multipliers, in order to facilitate convergence while avoiding the instabilities associated with a minimal factor graph construction. We end up with a two-level algorithm, in which a belief propagation algorithm is run alternately at the level of each cycle and at the inter-region level. The inverse problem of finding the couplings of a Markov random field from empirical covariances can be addressed region by region. This can be done particularly efficiently in the Ising context, where fixed-point equations can be derived along with a one-parameter log-likelihood function to minimize. Numerical experiments confirm the effectiveness of these considerations for both direct and inverse MRF inference. (47 pages, 16 figures)
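
    As a rough illustration of one ingredient mentioned above, the sketch below extracts a cycle basis of a small pairwise graph with networkx; the example graph is arbitrary, and the paper's actual region construction (discarding redundant Lagrange multipliers, the two-level message passing, the inverse Ising updates) is not reproduced.

```python
# Minimal sketch, assuming networkx is available: compute a cycle basis of a
# toy graph, the combinatorial object from which regions could then be built.
import networkx as nx

G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 2)])
basis = nx.cycle_basis(G)   # fundamental cycles w.r.t. a spanning tree
print(basis)                # two independent cycles; node order may vary
```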

    Fast Distributed Approximation for Max-Cut

    Finding a maximum cut is a fundamental task in many computational settings. Surprisingly, it has been insufficiently studied in the classic distributed settings, where vertices communicate by synchronously sending messages to their neighbors according to the underlying graph, known as the LOCAL or CONGEST models. We amend this by obtaining almost optimal algorithms for Max-Cut on a wide class of graphs in these models. In particular, for any ε > 0, we develop randomized approximation algorithms achieving a ratio of (1 − ε) to the optimum for Max-Cut on bipartite graphs in the CONGEST model, and on general graphs in the LOCAL model. We further present efficient deterministic algorithms, including a 1/3-approximation for Max-Dicut in our models, thus improving on the best known (randomized) ratio of 1/4. Our algorithms make non-trivial use of the greedy approach of Buchbinder et al. (SIAM Journal on Computing, 2015) for maximizing an unconstrained (non-monotone) submodular function, which may be of independent interest.
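
    For context only, the sketch below runs a classical centralized baseline: a random cut followed by single-vertex local search, which at a local optimum cuts at least half of the edges. The distributed CONGEST/LOCAL algorithms and the submodular greedy of Buchbinder et al. are not reproduced, and the example graph is made up.

```python
# Centralized baseline only, not the paper's distributed algorithms.
# At a local optimum every vertex has at least half of its incident edges cut,
# so the final cut contains at least |E|/2 edges.
import random
from collections import defaultdict

def local_search_max_cut(n, edges, seed=0):
    """Start from a random cut, then flip vertices while that grows the cut."""
    random.seed(seed)
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = [random.randint(0, 1) for _ in range(n)]
    improved = True
    while improved:
        improved = False
        for v in range(n):
            same = sum(1 for u in adj[v] if side[u] == side[v])
            if same > len(adj[v]) - same:   # flipping v cuts more edges
                side[v] ^= 1
                improved = True
    cut = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(local_search_max_cut(4, edges))
```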