On Approximate Nonlinear Gaussian Message Passing on Factor Graphs
Factor graphs have recently gained increasing attention as a unified
framework for representing and constructing algorithms for signal processing,
estimation, and control. One capability that does not seem to be well explored
within the factor graph toolkit is the ability to handle deterministic
nonlinear transformations, such as those occurring in nonlinear filtering and
smoothing problems, using tabulated message passing rules. In this
contribution, we provide general forward (filtering) and backward (smoothing)
approximate Gaussian message passing rules for deterministic nonlinear
transformation nodes in arbitrary factor graphs fulfilling a Markov property,
based on numerical quadrature procedures for the forward pass and a
Rauch-Tung-Striebel-type approximation for the backward pass. These message
passing rules can be employed for deriving many algorithms for solving
nonlinear problems using factor graphs, as we illustrate by proposing a
nonlinear modified Bryson-Frazier (MBF) smoother based on the presented
message passing rules.
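To make the forward rule concrete, here is a minimal Python sketch of quadrature-based Gaussian moment matching through a deterministic nonlinearity. The Gauss-Hermite rule and all function names are illustrative choices, not the paper's exact procedure or notation.

```python
import numpy as np

def forward_gaussian_message(f, m, v, order=10):
    """Approximate the moments of y = f(x) with x ~ N(m, v)
    by Gauss-Hermite quadrature (scalar case, illustrative only)."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(order)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalize so the weights sum to 1
    y = f(m + np.sqrt(v) * nodes)              # evaluate f at the scaled nodes
    mean = np.sum(weights * y)
    var = np.sum(weights * (y - mean) ** 2)
    return mean, var

# Example: push N(0.5, 0.2) through a cubic nonlinearity
m_y, v_y = forward_gaussian_message(lambda x: x**3, 0.5, 0.2)
```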
A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms
The benefits of automating design cycles for Bayesian inference-based
algorithms are becoming increasingly recognized by the machine learning
community. As a result, interest in probabilistic programming frameworks has
grown considerably over the past few years. This paper explores a specific
probabilistic programming paradigm, namely message passing in Forney-style
factor graphs (FFGs), in the context of automated design of efficient Bayesian
signal processing algorithms. To this end, we developed "ForneyLab"
(https://github.com/biaslab/ForneyLab.jl) as a Julia toolbox for message
passing-based inference in FFGs. We show by example how ForneyLab enables
automatic derivation of Bayesian signal processing algorithms, including
algorithms for parameter estimation and model comparison. Crucially, due to the
modular makeup of the FFG framework, both the model specification and inference
methods are readily extensible in ForneyLab. In order to test this framework,
we compared variational message passing as implemented by ForneyLab with
automatic differentiation variational inference (ADVI) and Monte Carlo methods
as implemented by state-of-the-art tools "Edward" and "Stan". In terms of
performance, extensibility, and stability, ForneyLab appears to enjoy an edge
over its competitors for automated inference in state-space models.
Comment: Accepted for publication in the International Journal of Approximate Reasoning.
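To give a flavor of what an automatically derived algorithm reduces to, here is a hand-worked Python sketch of Gaussian message passing for the toy model y = x + n: a forward (prior) message meets the backward message through an addition node, and their product gives the marginal. This illustrates the tabulated-rule idea only; it is not the ForneyLab API, which is documented in the repository linked above.

```python
def gaussian_posterior(prior_m, prior_v, obs, noise_v):
    """Posterior of x in the toy FFG  x ~ N(prior_m, prior_v),
    y = x + n,  n ~ N(0, noise_v),  y observed: the product of the
    forward message and the backward message N(obs, noise_v)."""
    post_v = 1.0 / (1.0 / prior_v + 1.0 / noise_v)          # precisions add
    post_m = post_v * (prior_m / prior_v + obs / noise_v)   # precision-weighted means
    return post_m, post_v

# Example: prior N(0, 1), observation y = 2.0 with unit noise -> N(1.0, 0.5)
print(gaussian_posterior(0.0, 1.0, 2.0, 1.0))
```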
Sigma Point Belief Propagation
The sigma point (SP) filter, also known as the unscented Kalman filter, is an
attractive alternative to the extended Kalman filter and the particle filter.
Here, we extend the SP filter to nonsequential Bayesian inference corresponding
to loopy factor graphs. We propose sigma point belief propagation (SPBP) as a
low-complexity approximation of the belief propagation (BP) message passing
scheme. SPBP achieves approximate marginalizations of posterior distributions
corresponding to (generally) loopy factor graphs. It is well suited for
decentralized inference because of its low communication requirements. For a
decentralized, dynamic sensor localization problem, we demonstrate that SPBP
can outperform nonparametric (particle-based) BP while requiring significantly
less computation and communication.
Comment: 5 pages, 1 figure.
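For reference, the sigma point moment matching that SPBP builds on is the classic unscented transform. A Python sketch follows; the spread parameter kappa and all names are illustrative, and this is only the building block, not the SPBP message schedule itself.

```python
import numpy as np

def unscented_moments(f, m, P, kappa=1.0):
    """Propagate N(m, P) through f using 2n+1 sigma points
    (classic unscented transform; illustrative sketch)."""
    n = m.size
    L = np.linalg.cholesky((n + kappa) * P)        # matrix square root
    sigma = np.vstack([m, m + L.T, m - L.T])       # rows of L.T are columns of L
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    Y = np.array([f(s) for s in sigma])            # push sigma points through f
    mean = w @ Y
    cov = (w[:, None] * (Y - mean)).T @ (Y - mean) # weighted outer products
    return mean, cov

# Example: a mildly nonlinear 2-D map
mean, cov = unscented_moments(lambda x: np.array([x[0]**2, x[0] + x[1]]),
                              np.zeros(2), np.eye(2))
```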
Binary Linear Classification and Feature Selection via Generalized Approximate Message Passing
For the problem of binary linear classification and feature selection, we
propose algorithmic approaches to classifier design based on the generalized
approximate message passing (GAMP) algorithm, recently proposed in the context
of compressive sensing. We are particularly motivated by problems where the
number of features greatly exceeds the number of training examples, but where
only a few features suffice for accurate classification. We show that
sum-product GAMP can be used to (approximately) minimize the classification
error rate and max-sum GAMP can be used to minimize a wide variety of
regularized loss functions. Furthermore, we describe an
expectation-maximization (EM)-based scheme to learn the associated model
parameters online, as an alternative to cross-validation, and we show that
GAMP's state-evolution framework can be used to accurately predict the
misclassification rate. Finally, we present a detailed numerical study to
confirm the accuracy, speed, and flexibility afforded by our GAMP-based
approaches to binary linear classification and feature selection.
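As one concrete ingredient, sum-product GAMP with a probit label channel needs the posterior mean and variance of the noiseless score given a label. Here is a Python sketch under the standard probit model p(y|z) = Phi(y*z); all names are illustrative rather than the paper's notation.

```python
import numpy as np
from scipy.stats import norm

def probit_channel_moments(y, p_hat, tau_p):
    """Posterior mean/variance of z ~ N(p_hat, tau_p) given a label
    y in {-1, +1} under the probit likelihood p(y | z) = Phi(y * z)."""
    s = y * p_hat / np.sqrt(1.0 + tau_p)
    r = norm.pdf(s) / norm.cdf(s)                           # inverse Mills ratio
    z_mean = p_hat + y * tau_p * r / np.sqrt(1.0 + tau_p)
    z_var = tau_p - tau_p**2 * r * (s + r) / (1.0 + tau_p)
    return z_mean, z_var

# Example: label +1 observed against a weak prior score estimate
print(probit_channel_moments(+1, 0.1, 2.0))
```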
Optimal Quantization for Compressive Sensing under Message Passing Reconstruction
We consider the optimal quantization of compressive sensing measurements
following recent work generalizing relaxed belief propagation (BP) to
arbitrary measurement channels. Relaxed BP is an iterative reconstruction
scheme inspired by message passing algorithms on bipartite graphs. Its
asymptotic error performance can be accurately predicted and tracked through
the state evolution formalism. We utilize these results to design mean-square
optimal scalar quantizers for relaxed BP signal reconstruction and empirically
demonstrate the superior error performance of the resulting quantizers.
Comment: 5 pages, 3 figures, submitted to IEEE International Symposium on Information Theory (ISIT) 2011; minor corrections in v
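For contrast with the BP-aware quantizers the paper designs, the classic MSE-optimal baseline is the Lloyd-Max scalar quantizer. The Python sketch below shows that baseline only; it optimizes plain reconstruction MSE rather than the end-to-end relaxed-BP error the paper targets, and all names are illustrative.

```python
import numpy as np

def lloyd_max(samples, n_levels, iters=100):
    """Classic Lloyd-Max design of an MSE-optimal scalar quantizer
    from samples (baseline criterion, not the paper's BP-aware one)."""
    # Initialize the codebook at quantiles of the empirical distribution
    levels = np.quantile(samples, (np.arange(n_levels) + 0.5) / n_levels)
    for _ in range(iters):
        edges = (levels[:-1] + levels[1:]) / 2.0   # nearest-neighbor boundaries
        idx = np.searchsorted(edges, samples)      # assign samples to cells
        for k in range(n_levels):
            cell = samples[idx == k]
            if cell.size:                          # centroid condition
                levels[k] = cell.mean()
    return levels

# Example: an 8-level quantizer for a standard Gaussian source
rng = np.random.default_rng(0)
codebook = lloyd_max(rng.standard_normal(100_000), 8)
```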