Polynomial Linear Programming with Gaussian Belief Propagation
Interior-point methods are state-of-the-art algorithms for solving linear
programming (LP) problems with polynomial complexity. Specifically, the
Karmarkar algorithm typically solves LP problems in time O(n^{3.5}), where n
is the number of unknown variables. Karmarkar's celebrated algorithm is known
to be an instance of the log-barrier method using the Newton iteration. The
main computational overhead of this method is in inverting the Hessian matrix
of the Newton iteration. In this contribution, we propose the application of
the Gaussian belief propagation (GaBP) algorithm as part of an efficient and
distributed LP solver that exploits the sparse and symmetric structure of the
Hessian matrix and avoids the need for direct matrix inversion. This approach
shifts the computation from the realm of linear algebra to that of probabilistic
inference on graphical models, thus applying GaBP as an efficient inference
engine. Our construction is general and can be used for any interior-point
algorithm which uses the Newton method, including non-linear program solvers.

Comment: 7 pages, 1 figure, appeared in the 46th Annual Allerton Conference on
Communication, Control and Computing, Allerton House, Illinois, Sept. 2008
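The core step — solving the Newton system without explicitly inverting the Hessian — can be illustrated with scalar GaBP on a generic sparse symmetric system Ax = b. This is a minimal sketch, not the paper's implementation; the function name and iteration parameters are assumptions, and convergence is guaranteed only for suitable matrices (e.g. diagonally dominant A):

```python
import numpy as np

def gabp_solve(A, b, max_iter=200, tol=1e-9):
    """Solve A x = b with Gaussian belief propagation (GaBP).
    Illustrative sketch: A symmetric and diagonally dominant, so the
    iteration converges. Messages are scalar precisions P[i, j] and
    means U[i, j] passed between variable nodes sharing an edge."""
    n = len(b)
    P = np.zeros((n, n))   # precision message i -> j
    U = np.zeros((n, n))   # mean message i -> j
    nbrs = [np.flatnonzero((A[i] != 0) & (np.arange(n) != i)) for i in range(n)]
    x = b / np.diag(A)
    for _ in range(max_iter):
        for i in range(n):
            for j in nbrs[i]:
                # cavity precision/mean: all of node i's information except j's message
                Pij = A[i, i] + sum(P[k, i] for k in nbrs[i] if k != j)
                Uij = (b[i] + sum(P[k, i] * U[k, i] for k in nbrs[i] if k != j)) / Pij
                P[i, j] = -A[i, j] ** 2 / Pij
                U[i, j] = Pij * Uij / A[i, j]
        # marginal means form the current solution estimate
        x_new = np.array([
            (b[i] + sum(P[k, i] * U[k, i] for k in nbrs[i]))
            / (A[i, i] + sum(P[k, i] for k in nbrs[i]))
            for i in range(n)
        ])
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x
```

Note that only nonzero entries of A exchange messages, which is what lets the method exploit the sparsity of the Hessian and distribute the computation.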
A Low Density Lattice Decoder via Non-Parametric Belief Propagation
The recent work of Sommer, Feder and Shalvi presented a new family of codes
called low density lattice codes (LDLC) that can be decoded efficiently and
approach the capacity of the AWGN channel. A linear time iterative decoding
scheme which is based on a message-passing formulation on a factor graph is
given.
In the current work we report our theoretical findings regarding the relation
between the LDLC decoder and belief propagation. We show that the LDLC decoder
is an instance of non-parametric belief propagation and further connect it to
the Gaussian belief propagation algorithm. Our new results enable borrowing
knowledge from the non-parametric and Gaussian belief propagation domains into
the LDLC domain. Specifically, we give more general conditions for convergence
of the LDLC decoder (under the same assumptions as the original LDLC
convergence analysis). We discuss how to extend the LDLC decoder from
Latin square to full-rank, non-square matrices. We propose an efficient
construction of a sparse generator matrix and its matching decoder. We report
preliminary experimental results which show that our decoder achieves a symbol
error rate comparable to that of the original LDLC decoder.

Comment: Submitted for publication
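In non-parametric belief propagation the messages are continuous densities, typically represented as Gaussian mixtures, and the elementary operation is the pairwise product of two mixtures. A minimal sketch of that operation (a hypothetical helper for illustration, not the LDLC decoder itself):

```python
import numpy as np

def mixture_product(m1, m2):
    """Pairwise product of two 1-D Gaussian mixtures, each given as a
    list of (weight, mean, variance) triples. The product of two
    Gaussians is Gaussian: precisions add and means combine with
    precision weights; each pair also contributes a scale factor
    N(mu1; mu2, v1 + v2) to its weight."""
    out = []
    for w1, mu1, v1 in m1:
        for w2, mu2, v2 in m2:
            v = 1.0 / (1.0 / v1 + 1.0 / v2)     # precisions add
            mu = v * (mu1 / v1 + mu2 / v2)      # precision-weighted mean
            z = np.exp(-0.5 * (mu1 - mu2) ** 2 / (v1 + v2)) \
                / np.sqrt(2.0 * np.pi * (v1 + v2))
            out.append((w1 * w2 * z, mu, v))
    total = sum(w for w, _, _ in out)
    return [(w / total, mu, v) for w, mu, v in out]
```

The component count multiplies at every product, which is why practical NBP decoders must prune or resample the mixtures; in the purely Gaussian special case each message collapses to a single component, recovering Gaussian belief propagation.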
Statistical physics-based reconstruction in compressed sensing
Compressed sensing is triggering a major evolution in signal acquisition. It
consists in sampling a sparse signal at low rate and later using computational
power for its exact reconstruction, so that only the necessary information is
measured. Currently used reconstruction techniques are, however, limited to
acquisition rates larger than the true density of the signal. We design a new
procedure which is able to reconstruct the signal exactly with a number of
measurements that approaches the theoretical limit in the limit of large
systems. It is based on the joint use of three essential ingredients: a
probabilistic approach to signal reconstruction, a message-passing algorithm
adapted from belief propagation, and a careful design of the measurement matrix
inspired from the theory of crystal nucleation. The performance of this new
algorithm is analyzed by statistical physics methods. The obtained improvement
is confirmed by numerical studies of several cases.

Comment: 20 pages, 8 figures, 3 tables. Related codes and data are available
at http://aspics.krzakala.or
Iterative Bayesian Learning for Crowdsourced Regression
Crowdsourcing platforms have emerged as popular venues for purchasing human
intelligence at low cost for large volumes of tasks. As many low-paid workers
are prone to give noisy answers, a common practice is to add redundancy by
assigning multiple workers to each task and then simply average out these
answers. However, to fully harness the wisdom of the crowd, one needs to learn
the heterogeneous quality of each worker. We resolve this fundamental challenge
in crowdsourced regression tasks, i.e., tasks whose answers take continuous
values, where identifying good or bad workers is considerably harder than in a
classification setting with discrete labels. In particular, we introduce a
Bayesian iterative scheme and show that it provably achieves the optimal mean
squared error. Our evaluations on synthetic and real-world datasets support our
theoretical results and show the superiority of the proposed scheme.
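The underlying idea — jointly estimating task answers and per-worker quality — can be sketched as a simple alternating scheme: precision-weighted averaging of answers, then re-estimating each worker's noise variance from residuals. This is an illustrative EM-style stand-in, not the paper's exact Bayesian iteration; the names and the variance floor are assumptions:

```python
import numpy as np

def estimate_tasks(Y, mask, iters=25, eps=1e-6):
    """Y: (tasks x workers) answer matrix; mask: which entries are
    observed. Alternate between (1) estimating each task's value as a
    precision-weighted average of its answers and (2) re-estimating
    each worker's noise variance from that worker's residuals."""
    var = np.ones(Y.shape[1])          # start with equal worker quality
    Yz = np.where(mask, Y, 0.0)
    for _ in range(iters):
        w = mask / var                 # per-entry precision weights
        x = (w * Yz).sum(1) / w.sum(1) # task estimates
        resid = np.where(mask, Y - x[:, None], 0.0)
        var = (resid ** 2).sum(0) / mask.sum(0) + eps  # worker noise variances
    return x, var
```

With equal initial variances the first pass reduces to the naive averaging baseline; subsequent passes downweight noisy workers, which is exactly the gain from learning heterogeneous worker quality.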