
    Quantum Belief Propagation

    We present an accurate numerical algorithm, called quantum belief propagation (QBP), for the simulation of one-dimensional quantum systems at non-zero temperature. The algorithm exploits the fact that quantum effects are short-range in these systems at non-zero temperature, decaying on a length scale inversely proportional to the temperature. We compare to exact results on a spin-1/2 Heisenberg chain. Even a very modest calculation, requiring the diagonalization of only 10-by-10 matrices, reproduces the peak susceptibility with a relative error of less than $10^{-5}$, while more elaborate calculations further reduce the error. Comment: 4 pages, 1 figure; revised time estimates due to improved implementation. Typographical corrections to Eq. 7 made; thanks to David Poulin for pointing out the mistake.
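
    A minimal sketch of the exact-diagonalization baseline mentioned above (not the QBP algorithm itself), assuming a short open spin-1/2 Heisenberg chain: the uniform susceptibility is computed from the thermal ensemble of a small Hamiltonian, the kind of exact result the paper compares against. Chain length, temperatures, and the coupling J are illustrative choices.

```python
import numpy as np

# Spin-1/2 operators (hbar = 1)
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2.0
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2.0
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2.0
I2 = np.eye(2, dtype=complex)

def site_op(op, i, N):
    """Embed a single-site operator at site i of an N-site chain."""
    out = np.array([[1.0 + 0j]])
    for k in range(N):
        out = np.kron(out, op if k == i else I2)
    return out

def heisenberg(N, J=1.0):
    """H = J * sum_i S_i . S_{i+1}, open boundary conditions (illustrative)."""
    H = np.zeros((2**N, 2**N), dtype=complex)
    for i in range(N - 1):
        for op in (sx, sy, sz):
            H += J * site_op(op, i, N) @ site_op(op, i + 1, N)
    return H

def susceptibility(N, T, J=1.0):
    """Uniform susceptibility per site: chi = beta * (<M^2> - <M>^2) / N."""
    H = heisenberg(N, J)
    Mz = sum(site_op(sz, i, N) for i in range(N))
    E, V = np.linalg.eigh(H)
    beta = 1.0 / T
    w = np.exp(-beta * (E - E.min()))
    w /= w.sum()                                                   # Boltzmann weights
    m1 = np.real(np.einsum('ia,ij,ja->a', V.conj(), Mz, V))        # <a|Mz|a>
    m2 = np.real(np.einsum('ia,ij,ja->a', V.conj(), Mz @ Mz, V))   # <a|Mz^2|a>
    return beta * (w @ m2 - (w @ m1) ** 2) / N

for T in (0.5, 1.0, 2.0):
    print(f"N=8, T={T}: chi per site = {susceptibility(8, T):.4f}")
```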

    Discriminated Belief Propagation

    Near-optimal decoding of good error control codes is generally a difficult task. However, for a certain type of (sufficiently) good codes an efficient decoding algorithm with near-optimal performance exists. These codes are defined via a combination of constituent codes with low-complexity trellis representations. Their decoding algorithm is an instance of (loopy) belief propagation and is based on an iterative transfer of constituent beliefs, where the beliefs are given by the symbol probabilities computed in the constituent trellises. Even though weak constituent codes are employed, close-to-optimal performance is obtained, i.e., the encoder/decoder pair (almost) achieves the information-theoretic capacity. However, (loopy) belief propagation only performs well for a rather specific set of codes, which limits its applicability. In this paper a generalisation of iterative decoding is presented: it is proposed to transfer more values than just the constituent beliefs. This is achieved by transferring beliefs obtained by independently investigating parts of the code space, which leads to the concept of discriminators. Discriminators are used to improve the decoder resolution within certain areas and define discriminated symbol beliefs. It is shown that these beliefs approximate the overall symbol probabilities, leading to an iteration rule that (below channel capacity) typically admits only the solution of the overall decoding problem. Via a Gauss approximation a low-complexity version of this algorithm is derived. Moreover, the approach may then be applied to a wide range of channel maps without significant complexity increase.
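
    For contrast with the discriminator construction, the following is a minimal sketch of plain loopy (sum-product) belief propagation decoding for a toy binary parity-check code. The parity-check matrix H, the channel log-likelihood ratios, and the iteration limit are illustrative choices; the discriminated-belief transfer described in the abstract is not reproduced here.

```python
import numpy as np

# Toy parity-check matrix: each row is one parity check (illustrative choice).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def sum_product_decode(H, llr_channel, max_iter=50):
    """Loopy sum-product decoding of channel LLRs; returns hard bit decisions."""
    m, n = H.shape
    msg_v2c = np.where(H, llr_channel[None, :], 0.0)     # variable -> check
    for _ in range(max_iter):
        # Check -> variable messages via the tanh rule, excluding the target edge.
        msg_c2v = np.zeros_like(msg_v2c)
        for i in range(m):
            idx = np.flatnonzero(H[i])
            t = np.tanh(0.5 * msg_v2c[i, idx])
            for k, j in enumerate(idx):
                prod = np.clip(np.prod(np.delete(t, k)), -0.999999, 0.999999)
                msg_c2v[i, j] = 2.0 * np.arctanh(prod)
        # Variable -> check: channel LLR plus all other incoming check messages.
        total = llr_channel + msg_c2v.sum(axis=0)
        msg_v2c = np.where(H, total[None, :] - msg_c2v, 0.0)
        hard = (total < 0).astype(int)
        if not np.any(H @ hard % 2):                     # all checks satisfied
            break
    return hard

# Example: all-zero codeword sent; one position received with a weakly wrong LLR.
llr = np.array([2.1, 1.8, -0.3, 2.5, 1.2, 1.9])          # positive LLR favours bit 0
print(sum_product_decode(H, llr))
```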

    Kernel Belief Propagation

    We propose a nonparametric generalization of belief propagation, Kernel Belief Propagation (KBP), for pairwise Markov random fields. Messages are represented as functions in a reproducing kernel Hilbert space (RKHS), and message updates are simple linear operations in the RKHS. KBP makes none of the assumptions commonly required in classical BP algorithms: the variables need not arise from a finite domain or a Gaussian distribution, nor must their relations take any particular parametric form. Rather, the relations between variables are represented implicitly, and are learned nonparametrically from training data. KBP has the advantage that it may be used on any domain where kernels are defined ($\mathbb{R}^d$, strings, groups), even where explicit parametric models are not known, or closed-form expressions for the BP updates do not exist. The computational cost of message updates in KBP is polynomial in the training data size. We also propose a constant-time approximate message update procedure by representing messages using a small number of basis functions. In experiments, we apply KBP to image denoising, depth prediction from still images, and protein configuration prediction: KBP is faster than competing classical and nonparametric approaches (by orders of magnitude, in some cases), while providing significantly more accurate results.
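
    A toy sketch of the message representation described above, assuming a Gaussian (RBF) kernel on a one-dimensional domain: a message is stored as a weight vector over training samples, and evaluating or re-combining messages reduces to Gram-matrix algebra. This follows the general flavour of RKHS messages, not the exact KBP update rule; the kernel bandwidth, sample points, and regularizer are made-up values.

```python
import numpy as np

def rbf_gram(X, Y, bandwidth=0.5):
    """Gaussian (RBF) Gram matrix between two 1-D sample sets."""
    return np.exp(-(X[:, None] - Y[None, :]) ** 2 / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(0)
X_train = rng.uniform(-2.0, 2.0, size=30)      # training samples for one variable
K = rbf_gram(X_train, X_train)

# Two incoming messages, each stored as RKHS weights: m(x) = sum_i a_i k(x_i, x).
alpha1 = rng.uniform(0.0, 1.0, size=30)
alpha2 = rng.uniform(0.0, 1.0, size=30)

# Pointwise product of the incoming messages, evaluated at the basis points ...
product_vals = (K @ alpha1) * (K @ alpha2)

# ... and projected back onto the kernel basis with a regularized linear solve,
# so the outgoing message is again just a weight vector (a linear operation).
lam = 1e-3
alpha_out = np.linalg.solve(K + lam * np.eye(len(X_train)), product_vals)

# The outgoing message is a function that can be evaluated anywhere on the domain.
x_query = np.array([-1.0, 0.0, 1.0])
print(rbf_gram(x_query, X_train) @ alpha_out)
```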

    Sigma Point Belief Propagation

    The sigma point (SP) filter, also known as the unscented Kalman filter, is an attractive alternative to the extended Kalman filter and the particle filter. Here, we extend the SP filter to nonsequential Bayesian inference corresponding to loopy factor graphs. We propose sigma point belief propagation (SPBP) as a low-complexity approximation of the belief propagation (BP) message passing scheme. SPBP achieves approximate marginalizations of posterior distributions corresponding to (generally) loopy factor graphs. It is well suited for decentralized inference because of its low communication requirements. For a decentralized, dynamic sensor localization problem, we demonstrate that SPBP can outperform nonparametric (particle-based) BP while requiring significantly less computation and communication. Comment: 5 pages, 1 figure.
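
    A minimal sketch of the sigma point (unscented) transform that SPBP builds on: a Gaussian belief is pushed through a nonlinear measurement function using deterministically chosen sigma points instead of particles. The function g, the prior, and the scaling parameters are illustrative; the full SPBP message-passing scheme is not reproduced here.

```python
import numpy as np

def unscented_transform(mean, cov, g, alpha=1.0, beta=2.0, kappa=0.0):
    """Approximate mean/cov of g(x) for x ~ N(mean, cov) using sigma points."""
    n = mean.size
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)
    # 2n + 1 sigma points: the mean plus a symmetric spread along Cholesky columns.
    pts = np.vstack([mean, mean + L.T, mean - L.T])
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)
    Y = np.array([g(p) for p in pts])
    mean_y = wm @ Y
    diff = Y - mean_y
    cov_y = (wc[:, None] * diff).T @ diff
    return mean_y, cov_y

# Example: a range measurement from a 2-D position belief, as in sensor localization.
g = lambda x: np.array([np.hypot(x[0] - 3.0, x[1] - 1.0)])
m, P = unscented_transform(np.array([0.0, 0.0]), 0.2 * np.eye(2), g)
print("predicted measurement mean:", m, "covariance:", P)
```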

    Belief Propagation Min-Sum Algorithm for Generalized Min-Cost Network Flow

    Belief Propagation algorithms are instruments used broadly to solve graphical model optimization and statistical inference problems. In the general case of a loopy graphical model, Belief Propagation is a heuristic that is quite successful in practice, even though its empirical success typically lacks theoretical guarantees. This paper extends the short list of special cases where correctness and/or convergence of a Belief Propagation algorithm is proven. We generalize the formulation of the Min-Sum Network Flow problem by relaxing the flow conservation (balance) constraints and then prove that the Belief Propagation algorithm converges to the exact result.
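
    A minimal sketch of the generic min-sum update the paper analyzes, applied to a small discrete pairwise model rather than the authors' network-flow formulation; the unary costs, pairwise costs, and the three-node loop are illustrative choices.

```python
import numpy as np
from itertools import product

states = 3                                    # each variable takes values 0, 1, 2
edges = [(0, 1), (1, 2), (2, 0)]              # a single loop, so BP here is "loopy"
rng = np.random.default_rng(1)
theta_node = rng.uniform(0, 1, size=(3, states))                      # unary costs
theta_edge = {e: rng.uniform(0, 1, size=(states, states)) for e in edges}

def pair_cost(i, j, xi, xj):
    """Pairwise cost of (x_i, x_j), regardless of edge orientation."""
    if (i, j) in theta_edge:
        return theta_edge[(i, j)][xi, xj]
    return theta_edge[(j, i)][xj, xi]

# msg[(i, j)][x_j]: cost that node i reports to neighbour j for each value x_j.
msg = {(i, j): np.zeros(states) for e in edges for (i, j) in (e, e[::-1])}
for _ in range(50):
    new = {}
    for (i, j) in msg:
        incoming = sum(msg[(k, i2)] for (k, i2) in msg if i2 == i and k != j)
        pairwise = np.array([[pair_cost(i, j, xi, xj) for xj in range(states)]
                             for xi in range(states)])
        cost = theta_node[i][:, None] + pairwise + np.asarray(incoming)[:, None]
        m = cost.min(axis=0)                  # minimize over x_i for each x_j
        new[(i, j)] = m - m.min()             # normalize to keep messages bounded
    msg = new

beliefs = theta_node + np.array(
    [sum(msg[(k, i2)] for (k, i2) in msg if i2 == i) for i in range(3)])
assignment = beliefs.argmin(axis=1)

# Brute-force minimum on this tiny model, for comparison.
best = min(product(range(states), repeat=3),
           key=lambda x: sum(theta_node[v][x[v]] for v in range(3)) +
                         sum(pair_cost(a, b, x[a], x[b]) for (a, b) in edges))
print("min-sum assignment:", assignment, " brute force:", np.array(best))
```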

    Belief propagation in monoidal categories

    We discuss a categorical version of the celebrated belief propagation algorithm. This provides a way to prove that some algorithms which are known or suspected to be analogous are actually identical when formulated generically. It also highlights the computational point of view in monoidal categories. Comment: In Proceedings QPL 2014, arXiv:1412.810
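
    A toy illustration of the computational reading, assuming finite probability spaces: in a chain-shaped model the conditional-probability matrices compose by ordinary matrix multiplication, and the marginal of the last variable is the prior pushed through the composite. The distributions below are made up for illustration.

```python
import numpy as np

prior = np.array([0.6, 0.4])                    # P(X1)
A = np.array([[0.9, 0.1], [0.2, 0.8]])          # P(X2 | X1), rows indexed by X1
B = np.array([[0.7, 0.3], [0.4, 0.6]])          # P(X3 | X2), rows indexed by X2

# Composing the two channels is an ordinary matrix product, and the marginal
# P(X3) is the prior pushed through the composite map.
composite = A @ B
print("P(X3) via composition:", prior @ composite)

# The same marginal computed by explicit summation over X1 and X2.
joint = prior[:, None, None] * A[:, :, None] * B[None, :, :]
print("P(X3) by summing the joint:", joint.sum(axis=(0, 1)))
```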