Gaussian Belief with dynamic data and in dynamic network
In this paper we analyse Belief Propagation over a Gaussian model in a
dynamic environment. Recently, this has been proposed as a method to average
local measurement values by a distributed protocol ("Consensus Propagation",
Moallemi & Van Roy, 2006), where the average is available for read-out at every
single node. In the case that the underlying network is constant but the values
to be averaged fluctuate ("dynamic data"), convergence and accuracy are
determined by the spectral properties of an associated Ruelle-Perron-Frobenius
operator. For Gaussian models on Erdos-Renyi graphs, numerical computation
points to a spectral gap remaining in the large-size limit, implying
exceptionally good scalability. In a model where the underlying network also
fluctuates ("dynamic network"), averaging is more effective than in the dynamic
data case. Altogether, this implies very good performance of these methods in
very large systems, and opens a new field of statistical physics of large (and
dynamic) information systems.
Comment: 5 pages, 7 figures
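The averaging primitive these protocols implement can be sketched with a toy synchronous gossip iteration; the ring topology, uniform 1/3 weights, and round count below are illustrative assumptions, not the Consensus Propagation message schedule.

```python
import numpy as np

# Toy distributed averaging on a ring: each node repeatedly replaces its value
# with the mean of itself and its two neighbours. With symmetric, doubly
# stochastic weights this converges to the global average at every node.
rng = np.random.default_rng(0)
n = 10
values = rng.normal(size=n)                          # local measurements
neighbours = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

x = values.copy()
for _ in range(300):                                 # synchronous rounds
    x = np.array([(x[i] + sum(x[j] for j in neighbours[i])) / 3
                  for i in range(n)])
# x[i] is now (approximately) values.mean(), readable at every node
```

On a static graph the convergence rate is governed by the spectral gap of the averaging matrix, the quantity whose dynamic-data analogue the abstract studies via the Ruelle-Perron-Frobenius operator.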
Distributed Convergence Verification for Gaussian Belief Propagation
Gaussian belief propagation (BP) is a computationally efficient method to
approximate the marginal distribution and has been widely used for inference
with high dimensional data as well as distributed estimation in large-scale
networks. However, the convergence of Gaussian BP is still an open issue.
Though sufficient convergence conditions have been studied in the literature,
verifying these conditions requires gathering all the information over the
whole network, which defeats the main advantage of distributed computing by
using Gaussian BP. In this paper, we propose a novel sufficient convergence
condition for Gaussian BP that applies to both the pairwise linear Gaussian
model and Gaussian Markov random fields. We show analytically that this
sufficient convergence condition can be easily verified in a distributed way
that satisfies the network topology constraint.
Comment: accepted by the Asilomar Conference on Signals, Systems, and Computers, 2017, Pacific Grove, CA. arXiv admin note: text overlap with arXiv:1706.0407
Distributed Large Scale Network Utility Maximization
Recent work by Zymnis et al. proposes an efficient primal-dual interior-point
method, using a truncated Newton method, for solving the network utility
maximization (NUM) problem. This method has shown superior performance relative
to the traditional dual-decomposition approach. Other recent work by Bickson et
al. shows how to compute efficiently and distributively the Newton step, which
is the main computational bottleneck of the Newton method, utilizing the
Gaussian belief propagation algorithm.
In the current work, we combine both approaches to create an efficient
distributed algorithm for solving the NUM problem. Unlike the work of Zymnis,
which uses a centralized approach, our new algorithm is easily distributed.
Using an empirical evaluation we show that our new method outperforms previous
approaches, including the truncated Newton method and dual-decomposition
methods. As an additional contribution, this is the first work to evaluate
the performance of the Gaussian belief propagation algorithm against the
preconditioned conjugate gradient method on a large-scale problem.
Comment: In the International Symposium on Information Theory (ISIT) 200
Gaussian Belief Propagation Based Multiuser Detection
In this work, we present a novel construction for solving the linear
multiuser detection problem using the Gaussian Belief Propagation algorithm.
Our algorithm yields an efficient, iterative and distributed implementation of
the MMSE detector. We compare our algorithm's performance to a recent result
and show improved memory consumption, fewer computation steps, and a
reduction in the number of messages sent. We prove that recent work by
Montanari et al. is an instance of our general algorithm, providing new
convergence results for both algorithms.
Comment: 6 pages, 1 figure; appeared in the 2008 IEEE International Symposium on Information Theory, Toronto, July 200
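As a rough sketch of what an iterative MMSE detector computes, the estimate solves the normal equations (H^T H + sigma^2 I) x = H^T y; below, plain Jacobi iteration stands in for the GaBP message passing, and the dimensions, channel matrix, and noise level are all illustrative assumptions.

```python
import numpy as np

# Linear MMSE detection: x_hat = (H^T H + sigma2 I)^{-1} H^T y, computed here
# by Jacobi iteration (a stand-in for an iterative/distributed solver).
rng = np.random.default_rng(1)
K, N = 4, 8                              # users, observations (assumed sizes)
H = 0.2 * rng.normal(size=(N, K))        # channel/spreading matrix (assumed)
sigma2 = 1.0
x_true = rng.choice([-1.0, 1.0], size=K)
y = H @ x_true + np.sqrt(sigma2) * rng.normal(size=N)

A = H.T @ H + sigma2 * np.eye(K)         # MMSE normal-equations matrix
b = H.T @ y
D = np.diag(A)                           # A is diagonally dominant here
x = np.zeros(K)
for _ in range(100):                     # Jacobi: x <- D^{-1}(b - (A - D) x)
    x = (b - (A - np.diag(D)) @ x) / D
```

Each Jacobi update touches only one row of A at a time, which is the property that makes such schemes, and their GaBP refinements, amenable to distributed implementation.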
On the Geometry of Message Passing Algorithms for Gaussian Reciprocal Processes
Reciprocal processes are acausal generalizations of Markov processes
introduced by Bernstein in 1932. In the literature, a significant amount of
attention has been focused on developing dynamical models for reciprocal
processes. Recently, probabilistic graphical models for reciprocal processes
have been provided. This opens the way to the application of efficient
inference algorithms in the machine learning literature to solve the smoothing
problem for reciprocal processes. Such algorithms are known to converge if the
underlying graph is a tree. This is not the case for a reciprocal process,
whose associated graphical model is a single loop network. The contribution of
this paper is twofold. First, we introduce belief propagation for Gaussian
reciprocal processes. Second, we establish a link between convergence analysis
of belief propagation for Gaussian reciprocal processes and stability theory
for differentially positive systems.
Comment: 15 pages; typos corrected. This paper introduces belief propagation for Gaussian reciprocal processes and extends the convergence analysis in arXiv:1603.04419 to the Gaussian case
Message-Passing Algorithms for Quadratic Minimization
Gaussian belief propagation (GaBP) is an iterative algorithm for computing
the mean of a multivariate Gaussian distribution, or equivalently, the minimum
of a multivariate positive definite quadratic function. Sufficient conditions,
such as walk-summability, that guarantee the convergence and correctness of
GaBP are known, but GaBP may fail to converge to the correct solution given an
arbitrary positive definite quadratic function. As was observed in previous
work, the GaBP algorithm fails to converge if the computation trees produced by
the algorithm are not positive definite. In this work, we will show that the
failure modes of the GaBP algorithm can be understood via graph covers, and we
prove that a parameterized generalization of the min-sum algorithm can be used
to ensure that the computation trees remain positive definite whenever the
input matrix is positive definite. We demonstrate that the resulting algorithm
is closely related to other iterative schemes for quadratic minimization such
as the Gauss-Seidel and Jacobi algorithms. Finally, we observe, empirically,
that there always exists a choice of parameters such that the above
generalization of the GaBP algorithm converges.
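The Gauss-Seidel and Jacobi schemes mentioned above can be written in a few lines for minimizing the quadratic (1/2) x^T A x - b^T x, i.e. solving Ax = b; the diagonally dominant test matrix is an illustrative assumption.

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])          # SPD and diagonally dominant
b = np.array([1.0, 2.0, 3.0])

def jacobi(A, b, iters=100):
    # all coordinates updated simultaneously from the previous iterate
    x = np.zeros_like(b)
    D = np.diag(A)
    for _ in range(iters):
        x = (b - (A - np.diag(D)) @ x) / D
    return x

def gauss_seidel(A, b, iters=100):
    # each coordinate update immediately uses the freshest values
    x = np.zeros_like(b)
    n = len(b)
    for _ in range(iters):
        for i in range(n):
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```

Both converge to np.linalg.solve(A, b) here; GaBP differs in that it also propagates curvature (variance) information along the graph, which is what the computation-tree positive-definiteness condition constrains.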
Convergence analysis of the information matrix in Gaussian belief propagation
Gaussian belief propagation (BP) has been widely used for distributed
estimation in large-scale networks such as the smart grid, communication
networks, and social networks, where local measurements/observations are
scattered over a wide geographical area. However, the convergence of Gaussian
BP is still an open issue. In this paper, we consider the convergence of
Gaussian BP, focusing in particular on the convergence of the information
matrix. We show analytically that the exchanged message information matrix
converges for an arbitrary positive semidefinite initial value, and that its
distance to the unique positive definite limit matrix decreases exponentially fast.
Comment: arXiv admin note: substantial text overlap with arXiv:1611.0201
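A scalar caricature of this result (an illustrative assumption, not the paper's matrix-valued analysis): the message-precision update x <- a - c^2/(a + x) has the unique positive fixed point sqrt(a^2 - c^2) whenever a > |c|, and iterates from any nonnegative start contract toward it exponentially fast.

```python
# Scalar analogue of the information-matrix recursion in Gaussian BP.
# The map f(x) = a - c**2 / (a + x) satisfies 0 < f'(x) <= (c/a)**2 < 1 for
# x >= 0 when a > |c|, so it is a contraction on the nonnegative reals.
a, c = 2.0, 1.0
limit = (a * a - c * c) ** 0.5           # unique positive fixed point, sqrt(3)

finals = []
for x0 in (0.0, 0.5, 10.0, 100.0):       # arbitrary nonnegative initial values
    x = x0
    for _ in range(60):
        x = a - c * c / (a + x)
    finals.append(x)
# every trajectory ends essentially at `limit`
```

The exponential rate here is the squared contraction factor (c/a)^2, mirroring the exponential decrease of the distance to the limit matrix stated in the abstract.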
A Low Density Lattice Decoder via Non-Parametric Belief Propagation
The recent work of Sommer, Feder and Shalvi presented a new family of codes
called low density lattice codes (LDLC) that can be decoded efficiently and
approach the capacity of the AWGN channel. A linear time iterative decoding
scheme which is based on a message-passing formulation on a factor graph is
given.
In the current work we report our theoretical findings regarding the relation
between the LDLC decoder and belief propagation. We show that the LDLC decoder
is an instance of non-parametric belief propagation and further connect it to
the Gaussian belief propagation algorithm. Our new results enable borrowing
knowledge from the non-parametric and Gaussian belief propagation domains into
the LDLC domain. Specifically, we give more general conditions for
convergence of the LDLC decoder (under the same assumptions as the original
LDLC convergence analysis). We discuss how to extend the LDLC decoder from
Latin square to full-rank, non-square matrices. We propose an efficient
construction of a sparse generator matrix and its matching decoder. We report
preliminary experimental results which show that our decoder achieves a symbol
error rate comparable to that of the original LDLC decoder.
Comment: Submitted for publication
High-Dimensional Gaussian Graphical Model Selection: Walk Summability and Local Separation Criterion
We consider the problem of high-dimensional Gaussian graphical model
selection. We identify a set of graphs for which an efficient estimation
algorithm exists, and this algorithm is based on thresholding of empirical
conditional covariances. Under a set of transparent conditions, we establish
structural consistency (or sparsistency) for the proposed algorithm, when the
number of samples n = omega(J_{min}^{-2} log p), where p is the number of
variables and J_{min} is the minimum (absolute) edge potential of the graphical
model. The sufficient conditions for sparsistency are based on the notion of
walk-summability of the model and the presence of sparse local vertex
separators in the underlying graph. We also derive novel non-asymptotic
necessary conditions on the number of samples required for sparsistency
- …
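A hedged stand-in for the thresholding idea (not the paper's estimator, which thresholds conditional covariances computed over small separator sets): for a small, well-conditioned Gaussian graphical model, the edge set can be recovered by thresholding the off-diagonal entries of the inverted empirical covariance, with the threshold set below J_min.

```python
import numpy as np

# Toy structure recovery for a 3-node chain 0-1-2. We threshold the empirical
# precision matrix; sample size, threshold, and model are illustrative.
rng = np.random.default_rng(2)
J = np.array([[1.0, 0.3, 0.0],
              [0.3, 1.0, 0.3],
              [0.0, 0.3, 1.0]])          # true precision (J_min = 0.3)
Sigma = np.linalg.inv(J)
X = rng.multivariate_normal(np.zeros(3), Sigma, size=20000)
J_hat = np.linalg.inv(np.cov(X, rowvar=False))

edges = {(i, j) for i in range(3) for j in range(i + 1, 3)
         if abs(J_hat[i, j]) > 0.15}     # threshold at J_min / 2
# should recover the chain edges (0,1) and (1,2), with no spurious (0,2) edge
```

The n = omega(J_{min}^{-2} log p) sample requirement reflects the same trade-off visible here: weaker edge potentials need more samples before the estimation noise falls below a usable threshold.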