Formal Definitions of Conservative PDFs
Under ideal conditions, the probability density function (PDF) of a random
variable, such as a sensor measurement, would be well known and amenable to
computation and communication tasks. However, this is often not the case, so
the user looks for some other PDF that approximates the true but intractable
PDF. Conservativeness is a commonly sought property of this approximating PDF,
especially in distributed or unstructured data systems where the data being
fused may contain unknown correlations. Roughly, a conservative approximation
is one that overestimates the uncertainty of a system. While prior work has
introduced some definitions of conservativeness, these definitions either apply
only to normal distributions or lose some of the intuitive appeal of the
(Gaussian) definitions of conservativeness. This work provides a general and intuitive
definition of conservativeness that is applicable to any probability
distribution, including multi-modal and uniform distributions. Unfortunately,
we show that this strong definition of conservativeness cannot be used to
evaluate data fusion techniques. Therefore, we also describe a weaker
definition of conservativeness and show it is preserved through common data fusion
methods such as the linear and log-linear opinion pool, and homogeneous
functionals. In addition, we show that after fusion, weak conservativeness is
preserved by Bayesian updates. These strong and weak definitions of
conservativeness can help design and evaluate potential correlation-agnostic
data fusion techniques.
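The linear and log-linear opinion pools named above can be sketched for discrete PDFs; the two example PDFs and the fusion weight below are arbitrary choices for illustration, not values from the paper.

```python
import numpy as np

# Two agents report discrete PDFs over the same 4-point sample space.
p = np.array([0.5, 0.3, 0.1, 0.1])
q = np.array([0.2, 0.2, 0.4, 0.2])
w = 0.6  # fusion weight, chosen arbitrarily for illustration

# Linear opinion pool: convex combination of the two PDFs.
linear = w * p + (1 - w) * q

# Log-linear opinion pool: weighted geometric mean, renormalized.
log_linear = p**w * q**(1 - w)
log_linear /= log_linear.sum()
```

Both pools yield valid PDFs; the log-linear pool concentrates mass where the inputs agree, while the linear pool preserves every mode of either input.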
Arithmetic Average Density Fusion -- Part I: Some Statistic and Information-theoretic Results
Finite mixture such as the Gaussian mixture is a flexible and powerful
probabilistic modeling tool for representing the multimodal distribution widely
involved in many estimation and learning problems. The core of it is
representing the target distribution by the arithmetic average (AA) of a finite
number of sub-distributions which constitute a mixture. While the mixture has
been widely used for single-sensor filter design, it is only recently that AA
fusion has demonstrated compelling performance for multi-sensor filter design. In
this paper, some statistical and information-theoretic results are given on the
covariance consistency, mean square error, mode-preservation capacity, and the
information divergence of the AA fusion approach. In particular, based on the
concept of conservative fusion, the relationship of the AA fusion with the
existing conservative fusion approaches such as covariance union and covariance
intersection is exposed. A suboptimal weighting approach is proposed which,
together with the best mixture-fit property of AA fusion, leads to a
max-min optimization problem. Linear Gaussian models are considered for
algorithm illustration and simulation comparison, resulting in the first-ever
AA fusion-based multi-sensor Kalman filter.
Comment: 30 pages, 14 figures, 3 tables. Information Fusion, 202
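The relationship between AA fusion and covariance intersection can be sketched for two Gaussian estimates; the estimates, the equal weights, and the moment-matching formulas below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Two Gaussian estimates (x1, P1) and (x2, P2), fused with weight w.
x1, P1 = np.array([0.0, 0.0]), np.diag([1.0, 2.0])
x2, P2 = np.array([1.0, 0.5]), np.diag([2.0, 1.0])
w = 0.5

def spread(x, m):
    d = x - m
    return np.outer(d, d)

# AA fusion: moment-matched mean and covariance of the two-component mixture.
x_aa = w * x1 + (1 - w) * x2
P_aa = w * (P1 + spread(x1, x_aa)) + (1 - w) * (P2 + spread(x2, x_aa))

# Covariance intersection: convex combination of the information matrices.
I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
P_ci = np.linalg.inv(w * I1 + (1 - w) * I2)
x_ci = P_ci @ (w * I1 @ x1 + (1 - w) * I2 @ x2)
```

The AA covariance inflates the weighted average of P1 and P2 by the spread of the component means, which is the mixture-level analogue of the covariance consistency the abstract discusses.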
Minimum information loss fusion in distributed sensor networks
A key assumption of distributed data fusion is
that individual nodes have no knowledge of the global network
topology and use only information which is available locally.
This paper considers the weighted exponential product (WEP)
rule as a methodology for conservatively fusing estimates with
an unknown degree of correlation between them. We provide a
preliminary investigation into how the methodology for selecting
the mixing parameter can be used to minimize the information
loss in the fused covariance, as opposed to reducing the Shannon
entropy (and hence maximizing the information) of the fused
covariance. Our results suggest that selecting a mixing parameter
which minimizes the information loss ensures that information
which is exclusive to the estimates from one source is not lost
during the fusion process. These results indicate that minimizing
the information loss provides a robust technique for selecting the
mixing parameter in WEP fusion.
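Selecting the WEP mixing parameter can be sketched for the Gaussian case, where WEP fusion reduces to covariance-intersection-style equations. The criterion shown below is determinant minimization (the entropy-based choice the abstract contrasts with, not its proposed information-loss criterion), and the two covariances are invented for illustration.

```python
import numpy as np

# Two correlated-to-an-unknown-degree covariance estimates.
P1 = np.array([[2.0, 0.0], [0.0, 1.0]])
P2 = np.array([[1.0, 0.3], [0.3, 2.0]])

def fused_cov(omega):
    """WEP/CI fused covariance for mixing parameter omega in (0, 1)."""
    return np.linalg.inv(omega * np.linalg.inv(P1)
                         + (1 - omega) * np.linalg.inv(P2))

# Grid search for the omega minimizing the fused-covariance determinant,
# i.e. minimizing the Shannon entropy of the fused Gaussian.
omegas = np.linspace(0.01, 0.99, 99)
best = min(omegas, key=lambda o: np.linalg.det(fused_cov(o)))
```

The paper's information-loss criterion would replace the determinant objective with one penalizing the loss of information exclusive to either source, while keeping the same one-dimensional search over the mixing parameter.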
A probabilistic interpretation of set-membership filtering: application to polynomial systems through polytopic bounding
Set-membership estimation is usually formulated in the context of set-valued
calculus and no probabilistic calculations are necessary. In this paper, we
show that set-membership estimation can be equivalently formulated in the
probabilistic setting by employing sets of probability measures. Inference in
set-membership estimation is thus carried out by computing expectations with
respect to the updated set of probability measures P as in the probabilistic
case. In particular, it is shown that inference can be performed by solving a
particular semi-infinite linear programming problem, which is a special case of
the truncated moment problem in which only the zero-th order moment is known
(i.e., the support). By writing the dual of the above semi-infinite linear
programming problem, it is shown that, if the nonlinearities in the measurement
and process equations are polynomial and if the bounding sets for initial
state, process and measurement noises are described by polynomial inequalities,
then an approximation of this semi-infinite linear programming problem can
efficiently be obtained by using the theory of sum-of-squares polynomial
optimization. We then derive a smart greedy procedure to compute a polytopic
outer-approximation of the true membership-set, by computing the minimum-volume
polytope that outer-bounds the set that includes all the means computed with
respect to P.
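The core idea that knowing only the zero-th order moment (the support) yields interval bounds on expectations can be sketched numerically: over the set of all probability measures on a support S, the expectation of any function f ranges exactly over [min f, max f] on S. The function, the polynomial support set, and the grid below are illustrative stand-ins for the paper's sum-of-squares machinery.

```python
import numpy as np

# A function whose expectation we want to bound.
f = lambda x, y: x + y**2

# Support S described by a polynomial inequality (the unit disk),
# sampled on a grid in place of sum-of-squares optimization.
xs, ys = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
inside = xs**2 + ys**2 <= 1.0

# Bounds on E[f] over every probability measure supported on S:
vals = f(xs[inside], ys[inside])
lower, upper = vals.min(), vals.max()
```

Any distribution on S (a point mass at the minimizer, a uniform density, etc.) produces an expectation inside this interval, which is what makes inference over the credal set P a linear program over measures.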
A Bayesian Framework to Constrain the Photon Mass with a Catalog of Fast Radio Bursts
A hypothetical photon mass, m_γ, gives an energy-dependent light speed
in a Lorentz-invariant theory. Such a modification causes an additional time
delay between photons of different energies when they travel through a fixed
distance. Fast radio bursts (FRBs), with their short time duration and
cosmological propagation distance, are excellent astrophysical objects for
constraining m_γ. Here, for the first time, we develop a Bayesian framework
to study this problem with a catalog of FRBs. FRBs with and without a
redshift measurement are both useful in this framework and can be combined in
a Bayesian way. A catalog of 21 FRBs (including 20 FRBs without a redshift
measurement, and one, FRB 121102, with a measured redshift) gives a combined limit ,
or equivalently (, or equivalently ) at the 68% (95%) confidence level, which represents the
best limit that comes purely from kinematics. The framework proposed here will
be valuable when FRBs are observed daily in the future. Increases in the number
of FRBs, and refinements in the knowledge of the electron distributions in
the Milky Way, the host galaxies of FRBs, and the intergalactic medium, will
further tighten the constraint.
Comment: 10 pages, 6 figures; Physical Review D, in press
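The Bayesian combination of independent bursts can be sketched schematically: per-burst log-likelihoods add, and the combined posterior is the normalized product with the prior. The parameter grid, the flat prior, and the Gaussian per-burst likelihoods below are invented for illustration and are not the paper's dispersion-measure model or its actual limits.

```python
import numpy as np

# Dummy one-dimensional grid for a non-negative delay parameter.
grid = np.linspace(0.0, 5.0, 501)
prior = np.ones_like(grid)  # flat prior, for illustration only

# Invented per-burst Gaussian likelihoods centred at zero (no detection);
# independence lets the log-likelihoods simply add across bursts.
loglike = sum(-0.5 * (grid / s) ** 2 for s in (1.0, 1.5, 2.0))

# Normalized combined posterior (subtract the max for numerical stability).
post = prior * np.exp(loglike - loglike.max())
post /= post.sum()

# One-sided 95% credible upper limit from the cumulative posterior.
upper95 = grid[np.searchsorted(np.cumsum(post), 0.95)]
```

Bursts with and without a redshift measurement would enter this product through different likelihood forms, which is the combination the abstract describes.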
Probabilistic Traversability Model for Risk-Aware Motion Planning in Off-Road Environments
A key challenge in off-road navigation is that even visually similar terrains
or ones from the same semantic class may have substantially different traction
properties. Existing work typically assumes no wheel slip or uses the expected
traction for motion planning, where the predicted trajectories provide a poor
indication of the actual performance if the terrain traction has high
uncertainty. In contrast, this work proposes to analyze terrain traversability
with the empirical distribution of traction parameters in unicycle dynamics,
which can be learned by a neural network in a self-supervised fashion. The
probabilistic traction model leads to two risk-aware cost formulations that
account for the worst-case expected cost and traction. To help the learned
model generalize to unseen environments, terrains with features that lead to
unreliable predictions are detected via a density estimator fit to the trained
network's latent space and avoided via auxiliary penalties during planning.
Simulation results demonstrate that the proposed approach outperforms existing
work that assumes no slip or uses the expected traction in both navigation
success rate and completion time. Furthermore, avoiding terrains with low
density-based confidence score achieves up to 30% improvement in success rate
when the learned traction model is used in a novel environment.
Comment: To appear in IROS23. Video and code:
https://github.com/mit-acl/mppi_numb
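The contrast between planning on the expected traction and on a worst-case expectation can be sketched with an empirical traction distribution; the Beta-distributed samples, the tail level, and the left-tail CVaR estimator below are illustrative assumptions, not the paper's learned model or exact cost.

```python
import numpy as np

# Invented traction samples in [0, 1], standing in for the empirical
# distribution a self-supervised network would predict per terrain cell.
rng = np.random.default_rng(0)
traction = rng.beta(5.0, 2.0, size=1000)

def left_cvar(samples, alpha=0.1):
    """Mean of the worst alpha-fraction (lowest-traction) samples."""
    k = max(1, int(np.ceil(alpha * samples.size)))
    return np.sort(samples)[:k].mean()

expected = traction.mean()            # what expected-traction planners use
worst_case = left_cvar(traction, 0.1) # pessimistic, risk-aware estimate
```

A planner costing trajectories with `worst_case` rather than `expected` is penalized on high-variance terrain even when the mean traction looks benign, which is the failure mode of expected-traction planning the abstract targets.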