Kernel Belief Propagation
We propose a nonparametric generalization of belief propagation, Kernel
Belief Propagation (KBP), for pairwise Markov random fields. Messages are
represented as functions in a reproducing kernel Hilbert space (RKHS), and
message updates are simple linear operations in the RKHS. KBP makes none of the
assumptions commonly required in classical BP algorithms: the variables need
not arise from a finite domain or a Gaussian distribution, nor must their
relations take any particular parametric form. Rather, the relations between
variables are represented implicitly, and are learned nonparametrically from
training data. KBP has the advantage that it may be used on any domain where
kernels are defined (R^d, strings, groups), even where explicit parametric
models are not known, or closed form expressions for the BP updates do not
exist. The computational cost of message updates in KBP is polynomial in the
training data size. We also propose a constant-time approximate message update
procedure by representing messages using a small number of basis functions. In
experiments, we apply KBP to image denoising, depth prediction from still
images, and protein configuration prediction: KBP is faster than competing
classical and nonparametric approaches (by orders of magnitude, in some cases),
while providing significantly more accurate results.
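
As a rough illustration of the "message updates are simple linear operations in the RKHS" claim, here is a minimal NumPy sketch of one such update. The function names, the edge operator L_ts, and the RBF kernel choice are illustrative assumptions; the paper's actual update has details this sketch omits.

```python
import numpy as np

def rbf_gram(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def kbp_message_update(K_t, L_ts, incoming):
    """Simplified KBP-style update for the message from node t to node s.

    K_t      : (m, m) Gram matrix of the m training samples at node t.
    L_ts     : (m, m) precomputed linear operator for edge (t, s), e.g. a
               regularized solve folding in the learned pairwise relation
               (a hypothetical stand-in for the paper's operator).
    incoming : list of (m,) coefficient vectors for the messages into t
               from its neighbors other than s.
    Returns the (m,) coefficient vector beta such that the outgoing
    message is m_ts(x_s) = sum_i beta[i] * k(x_train_s[i], x_s).
    """
    # Evaluate each incoming message at t's training points and take the
    # pointwise product (the BP message-product step).
    prod = np.ones(K_t.shape[0])
    for beta in incoming:
        prod *= K_t @ beta
    # The update itself is a single linear operation in the RKHS.
    return L_ts @ prod
```

Each update costs O(m^2) in the number m of training points, consistent with the polynomial-cost claim; the constant-time variant mentioned above would replace K_t and L_ts with thin low-rank factors built from a small set of basis functions.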
A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms
The benefits of automating design cycles for Bayesian inference-based
algorithms are becoming increasingly recognized by the machine learning
community. As a result, interest in probabilistic programming frameworks has
grown considerably over the past few years. This paper explores a specific
probabilistic programming paradigm, namely message passing in Forney-style
factor graphs (FFGs), in the context of automated design of efficient Bayesian
signal processing algorithms. To this end, we developed "ForneyLab"
(https://github.com/biaslab/ForneyLab.jl) as a Julia toolbox for message
passing-based inference in FFGs. We show by example how ForneyLab enables
automatic derivation of Bayesian signal processing algorithms, including
algorithms for parameter estimation and model comparison. Crucially, due to the
modular makeup of the FFG framework, both the model specification and inference
methods are readily extensible in ForneyLab. In order to test this framework,
we compared variational message passing as implemented by ForneyLab with
automatic differentiation variational inference (ADVI) and Monte Carlo methods
as implemented by the state-of-the-art tools "Edward" and "Stan". In terms of
performance, extensibility, and stability, ForneyLab appears to enjoy an edge
over its competitors for automated inference in state-space models.

Comment: Accepted for publication in the International Journal of Approximate Reasoning.
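
For orientation, the following generic NumPy sketch spells out the sum-product message passing that such a toolbox derives automatically from a model specification. It is a hand-written illustration on a two-variable chain, not ForneyLab's Julia API, and all names are illustrative.

```python
import numpy as np

# Two-variable chain x1 -- f -- x2 with a discrete pairwise factor.
prior_x1 = np.array([0.6, 0.4])      # p(x1)
f = np.array([[0.9, 0.1],            # p(x2 | x1)
              [0.2, 0.8]])
evidence_x2 = np.array([0.3, 0.7])   # likelihood message into x2

# Forward message mu_{f->x2}(x2) = sum_{x1} p(x1) p(x2 | x1)
mu_fwd = prior_x1 @ f
# Backward message mu_{f->x1}(x1) = sum_{x2} p(x2 | x1) * evidence(x2)
mu_bwd = f @ evidence_x2

# Marginals are (normalized) products of incoming messages.
post_x1 = prior_x1 * mu_bwd
post_x1 /= post_x1.sum()
post_x2 = mu_fwd * evidence_x2
post_x2 /= post_x2.sum()
print(post_x1, post_x2)
```

The appeal of the FFG paradigm is that this schedule of forward and backward messages, written by hand here, is exactly what the toolbox generates automatically once the model is specified.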
A BP-MF-EP Based Iterative Receiver for Joint Phase Noise Estimation, Equalization and Decoding
In this work, an iterative receiver combining belief propagation (BP), mean
field (MF), and expectation propagation (EP) is designed for joint phase noise
(PN) estimation, equalization, and decoding in a coded communication system.
The presence of the PN results in a nonlinear observation model.
Conventionally, the nonlinear model is directly linearized by using the
first-order Taylor approximation, e.g., in the state-of-the-art soft-input
extended Kalman smoothing approach (soft-in EKS). In this work, MF is used to
handle the factor due to the nonlinear model, and a second-order Taylor
approximation is used to achieve Gaussian approximation to the MF messages,
which is crucial to the low-complexity implementation of the receiver with BP
and EP. Our approximation turns out to be more effective than the direct
linearization in the soft-in EKS at similar complexity, leading to significant
performance improvement, as demonstrated by simulation results.

Comment: 5 pages, 3 figures, Resubmitted to IEEE Signal Processing Letters.
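
As a hedged illustration of why the second-order expansion helps (a toy check of ours, not the paper's derivation): for a Gaussian phase estimate theta ~ N(mu, var), expanding the phase-noise factor e^{j*theta} to second order around mu yields the closed-form moment E[e^{j*theta}] ≈ e^{j*mu}(1 - var/2), the kind of Gaussian-moment computation that keeps the BP/EP updates tractable.

```python
import numpy as np

# Illustrative check (not the paper's receiver): with theta ~ N(mu, var),
# the second-order Taylor expansion of e^{j*theta} around mu gives
# E[e^{j*theta}] ~= e^{j*mu} * (1 - var / 2).

mu, var = 0.3, 0.05

taylor2_mean = np.exp(1j * mu) * (1.0 - var / 2.0)

# Monte Carlo reference for comparison.
rng = np.random.default_rng(0)
samples = rng.normal(mu, np.sqrt(var), 100_000)
mc_mean = np.mean(np.exp(1j * samples))

print(taylor2_mean, mc_mean)  # the two estimates agree closely for small var
```

For small var the two estimates agree to a few decimal places (the exact moment is e^{j*mu} e^{-var/2}), which is what makes a Gaussian approximation of the MF messages accurate in this regime.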