
    Cycle-based Cluster Variational Method for Direct and Inverse Inference

    We elaborate on the idea that loop corrections to belief propagation on pairwise Markov random fields can be handled systematically by using the elements of a cycle basis to define regions in a generalized belief propagation setting. The region graph is specified so as to avoid dual loops as much as possible, by discarding redundant Lagrange multipliers, in order to facilitate convergence while avoiding the instabilities associated with the minimal factor graph construction. We end up with a two-level algorithm, in which a belief propagation algorithm is run alternately at the level of each cycle and at the inter-region level. The inverse problem of finding the couplings of a Markov random field from empirical covariances can be addressed region-wise. It turns out that this can be done particularly efficiently in the Ising context, where fixed-point equations can be derived along with a one-parameter log-likelihood function to minimize. Numerical experiments confirm the effectiveness of these considerations for both direct and inverse MRF inference. Comment: 47 pages, 16 figures
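    As a concrete reference point for the pairwise-MRF setting this abstract builds on, here is a minimal Python sketch of ordinary loopy belief propagation on an Ising model. It is the baseline algorithm whose loop errors the cycle-based regions are meant to correct, not the paper's two-level method; the 4-spin cycle, couplings, and fields are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's cycle-based algorithm): plain loopy belief
# propagation on a pairwise Ising MRF, p(s) ∝ exp(sum_ij J_ij s_i s_j + sum_i h_i s_i).
import numpy as np

def ising_bp(edges, J, h, n, iters=200, damping=0.5, tol=1e-8):
    """Sum-product BP for spins s_i in {-1,+1}; msgs[(i, j)] is the message i -> j over s_j."""
    nbrs = {i: [] for i in range(n)}
    for (i, j) in edges:
        nbrs[i].append(j); nbrs[j].append(i)
    spins = np.array([-1.0, 1.0])
    msgs = {(i, j): np.ones(2) / 2 for (i, j) in edges}
    msgs.update({(j, i): np.ones(2) / 2 for (i, j) in edges})
    for _ in range(iters):
        diff = 0.0
        for (i, j) in list(msgs):
            Jij = J[(i, j)] if (i, j) in J else J[(j, i)]
            # Product of messages into i from all neighbours except j, times the local field term.
            prod = np.exp(h[i] * spins)
            for k in nbrs[i]:
                if k != j:
                    prod = prod * msgs[(k, i)]
            # Sum over s_i for each value of s_j.
            new = np.array([np.sum(np.exp(Jij * spins * sj) * prod) for sj in spins])
            new /= new.sum()
            new = damping * msgs[(i, j)] + (1 - damping) * new
            diff = max(diff, np.abs(new - msgs[(i, j)]).max())
            msgs[(i, j)] = new
        if diff < tol:
            break
    # Single-spin marginals (beliefs) from all incoming messages.
    beliefs = {}
    for i in range(n):
        b = np.exp(h[i] * spins)
        for k in nbrs[i]:
            b = b * msgs[(k, i)]
        beliefs[i] = b / b.sum()
    return beliefs

# Illustrative 4-spin cycle: the kind of short loop a cycle-basis region would capture exactly.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
J = {e: 0.4 for e in edges}
h = np.array([0.1, -0.2, 0.05, 0.0])
print(ising_bp(edges, J, h, n=4))
```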

    A very fast inference algorithm for finite-dimensional spin glasses: Belief Propagation on the dual lattice

    Starting from a Cluster Variational Method, and inspired by the correctness of the paramagnetic Ansatz (at high temperatures in general, and at any temperature in the 2D Edwards-Anderson model), we propose a novel message passing algorithm, the Dual algorithm, to estimate the marginal probabilities of spin glasses on finite-dimensional lattices. We show that over a wide range of temperatures our algorithm compares very well with Monte Carlo simulations, with the Double Loop algorithm, and with exact calculations of the ground state of 2D systems with bimodal and Gaussian interactions. Moreover, it is usually 100 times faster than other provably convergent methods such as the Double Loop algorithm. Comment: 23 pages, 12 figures. v2: improved introduction
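    To make the comparison target concrete, here is a minimal Python sketch of the kind of exact baseline the abstract mentions: a brute-force ground-state search for a tiny 2D Edwards-Anderson instance with bimodal couplings. This is not the Dual algorithm; the lattice size, periodic boundaries, and random seed are illustrative assumptions, and exhaustive search is only feasible for very small systems.

```python
# Minimal sketch: exact ground state of a tiny 2D Edwards-Anderson model with +/-J bonds.
import itertools
import numpy as np

rng = np.random.default_rng(0)
L = 3  # 3x3 periodic lattice; 2^(L*L) configurations, so keep L tiny

# Random +/-1 couplings on horizontal and vertical bonds (periodic boundaries).
Jh = rng.choice([-1.0, 1.0], size=(L, L))  # bond (i, j)-(i, j+1)
Jv = rng.choice([-1.0, 1.0], size=(L, L))  # bond (i, j)-(i+1, j)

def energy(s):
    """E = -sum_<ij> J_ij s_i s_j on the periodic square lattice."""
    e = 0.0
    for i in range(L):
        for j in range(L):
            e -= Jh[i, j] * s[i, j] * s[i, (j + 1) % L]
            e -= Jv[i, j] * s[i, j] * s[(i + 1) % L, j]
    return e

# Exhaustive enumeration of all spin configurations.
best_e, best_s = np.inf, None
for bits in itertools.product([-1, 1], repeat=L * L):
    s = np.array(bits).reshape(L, L)
    e = energy(s)
    if e < best_e:
        best_e, best_s = e, s

print("ground-state energy per spin:", best_e / L**2)
print(best_s)
```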

    A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms

    The benefits of automating design cycles for Bayesian inference-based algorithms are becoming increasingly recognized by the machine learning community. As a result, interest in probabilistic programming frameworks has grown considerably over the past few years. This paper explores a specific probabilistic programming paradigm, namely message passing in Forney-style factor graphs (FFGs), in the context of the automated design of efficient Bayesian signal processing algorithms. To this end, we developed "ForneyLab" (https://github.com/biaslab/ForneyLab.jl), a Julia toolbox for message passing-based inference in FFGs. We show by example how ForneyLab enables the automatic derivation of Bayesian signal processing algorithms, including algorithms for parameter estimation and model comparison. Crucially, due to the modular makeup of the FFG framework, both the model specification and the inference methods are readily extensible in ForneyLab. To test this framework, we compared variational message passing as implemented by ForneyLab with automatic differentiation variational inference (ADVI) and Monte Carlo methods as implemented by the state-of-the-art tools "Edward" and "Stan". In terms of performance, extensibility, and stability, ForneyLab appears to enjoy an edge relative to its competitors for automated inference in state-space models. Comment: Accepted for publication in the International Journal of Approximate Reasoning
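    For intuition about what "message passing on a Forney-style factor graph" produces for a state-space model, here is a minimal plain-Python sketch (not ForneyLab, which is a Julia package, and not its API): forward sum-product message passing on a scalar linear-Gaussian model, where the messages stay Gaussian and the update that falls out is the familiar Kalman filter. The model parameters and observations below are illustrative assumptions.

```python
# Minimal sketch: forward Gaussian message passing on a scalar linear state-space model.
# Model: x_t = a * x_{t-1} + process noise (var q);  y_t = x_t + observation noise (var r).
import numpy as np

a, q, r = 0.9, 0.1, 0.5                      # assumed model parameters
y = np.array([1.2, 0.8, 1.5, 1.1])           # assumed observations

# Forward (filtering) messages are Gaussian, so passing a message through each
# factor node only requires propagating a mean and a variance.
m, v = 0.0, 1.0                              # prior message on x_0
for t, yt in enumerate(y):
    # Message through the transition factor N(x_t | a * x_{t-1}, q).
    m_pred, v_pred = a * m, a * a * v + q
    # Combine with the upward message from the observation factor N(y_t | x_t, r)
    # at the equality node: the product of two Gaussians gives the Kalman update.
    k = v_pred / (v_pred + r)
    m = m_pred + k * (yt - m_pred)
    v = (1 - k) * v_pred
    print(f"t={t}: filtered mean {m:.3f}, variance {v:.3f}")
```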