Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model
Bayesian networks in their Factor Graph Reduced Normal Form (FGrn) are a
powerful paradigm for implementing inference graphs. Unfortunately, the
computational and memory costs of these networks may be considerable, even for
relatively small networks, and this is one of the main reasons why these
structures have often been underused in practice. In this work, through a
detailed algorithmic and structural analysis, various solutions for cost
reduction are proposed. An online version of the classic batch learning
algorithm is also analyzed and shown to give very similar results in an
unsupervised context, which is essential if multilevel structures are to be built. The
solutions proposed, together with the possible online learning algorithm, are
included in a C++ library that is quite efficient, especially if compared to
the direct use of the well-known sum-product and Maximum Likelihood (ML)
algorithms. The results are discussed with particular reference to a Latent
Variable Model (LVM) structure.
Comment: 20 pages, 8 figures
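The sum-product step that such a library must evaluate repeatedly can be sketched as follows. This is a generic discrete factor-to-variable message update, not the FGrn library's actual API; all names and the tiny conditional table are illustrative:

```python
import numpy as np

def factor_to_variable_message(factor, incoming, axis):
    """Sum-product message from a discrete factor to one variable.

    factor   : ndarray holding the factor's (conditional) probability table
    incoming : 1-D messages from all other connected variables, in axis order
    axis     : index of the target variable's axis in `factor`
    """
    msg = factor.copy()
    in_axes = [a for a in range(factor.ndim) if a != axis]
    # Multiply in the incoming messages along every non-target axis.
    for a, m in zip(in_axes, incoming):
        shape = [1] * factor.ndim
        shape[a] = m.size
        msg = msg * m.reshape(shape)
    # Marginalize out all non-target variables, then normalize.
    out = msg.sum(axis=tuple(in_axes))
    return out / out.sum()

# Tiny example: P(Y|X) with X binary and Y ternary.
p_y_given_x = np.array([[0.7, 0.2, 0.1],
                        [0.1, 0.3, 0.6]])
msg_x = np.array([0.5, 0.5])            # incoming message about X
msg_to_y = factor_to_variable_message(p_y_given_x, [msg_x], axis=1)
```

The cost-reduction analysis in the paper targets exactly the multiply-and-marginalize pattern above, whose naive cost grows with the product of all variable cardinalities.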
Moment-Based Variational Inference for Markov Jump Processes
We propose moment-based variational inference as a flexible framework for
approximate smoothing of latent Markov jump processes. The main ingredient of
our approach is to partition the set of all transitions of the latent process
into classes. This allows us to express the Kullback-Leibler divergence between
the approximate and the exact posterior process in terms of a set of moment
functions that arise naturally from the chosen partition. To illustrate
possible choices of the partition, we consider special classes of jump
processes that frequently occur in applications. We then extend the results to
parameter inference and demonstrate the method on several examples.
Comment: Accepted at the 36th International Conference on Machine Learning (ICML 2019)
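As background, the latent jump processes targeted here can be simulated exactly with the Gillespie algorithm. The sketch below simulates a birth-death process (a common MJP example; the process and its rate values are illustrative, not taken from the paper) and tallies transitions by class, the kind of partition-based statistics on which the moment functions are built:

```python
import random

def gillespie_mjp(rates, x0, t_max, rng):
    """Exact simulation of a birth-death Markov jump process.

    rates : (birth_rate, death_rate_per_individual) -- hypothetical values
    Returns jump times, states, and per-class transition counts.
    """
    birth, death = rates
    t, x = 0.0, x0
    times, states = [t], [x]
    counts = {"birth": 0, "death": 0}        # the transition classes
    while t < t_max:
        a_birth, a_death = birth, death * x  # propensities
        total = a_birth + a_death
        if total == 0:
            break
        t += rng.expovariate(total)          # waiting time to next jump
        if t >= t_max:
            break
        if rng.random() < a_birth / total:   # pick which class fires
            x += 1; counts["birth"] += 1
        else:
            x -= 1; counts["death"] += 1
        times.append(t); states.append(x)
    return times, states, counts

rng = random.Random(0)
times, states, counts = gillespie_mjp((1.0, 0.1), x0=5, t_max=50.0, rng=rng)
```

Grouping jumps into classes like `"birth"`/`"death"` is the same kind of partition the method uses to choose its moment functions.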
Simulation-based Bayesian econometric inference: principles and some recent computational advances
In this paper we discuss several aspects of simulation-based Bayesian econometric inference. We start at an elementary level on basic concepts of Bayesian analysis; evaluating integrals by simulation methods is a crucial ingredient in Bayesian inference. Next, the most popular and well-known simulation techniques are discussed: the Metropolis-Hastings algorithm and Gibbs sampling (being the most popular Markov chain Monte Carlo methods) and importance sampling. After that, we discuss two recently developed sampling methods: adaptive radial-based direction sampling [ARDS], which makes use of a transformation to radial coordinates, and neural network sampling, which makes use of a neural network approximation to the posterior distribution of interest. Both methods are especially useful in cases where the posterior distribution is not well-behaved, in the sense of having highly non-elliptical shapes. The simulation techniques are illustrated in several example models, such as a model for the real US GNP and models for binary data of a US recession indicator.
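The random-walk Metropolis-Hastings algorithm mentioned above can be sketched in a few lines. This is a generic textbook version with an arbitrary standard-normal target, not code from the paper:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step, rng):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)          # symmetric proposal
        lp_prop = log_target(prop)
        # Accept with probability min(1, target(prop) / target(x)).
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

rng = random.Random(42)
# Illustrative target: standard-normal log-density, up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000, 1.0, rng)
```

For highly non-elliptical posteriors a simple random walk like this mixes poorly, which is precisely what motivates the ARDS and neural network samplers discussed in the paper.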
A heteroencoder architecture for prediction of failure locations in porous metals using variational inference
In this work we employ an encoder-decoder convolutional neural network to
predict the failure locations of porous metal tension specimens based only on
their initial porosities. The process we model is complex, with a progression
from initial void nucleation, to saturation, and ultimately failure. The
objective of predicting failure locations presents an extreme case of class
imbalance, since most of the material in the specimens does not fail. In response
to this challenge, we develop and demonstrate the effectiveness of data- and
loss-based regularization methods. Since there is considerable sensitivity of
the failure location to the particular configuration of voids, we also use
variational inference to provide uncertainties for the neural network
predictions. We connect the deterministic and Bayesian convolutional neural
networks at a theoretical level to explain how variational inference
regularizes the training and predictions. We demonstrate that the resulting
predicted variances are effective in ranking the locations that are most likely
to fail in any given specimen.
Comment: 40 pages, 12 figures
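One simple loss-based way to counter such class imbalance is to up-weight the rare positive class in the cross-entropy. The sketch below is a generic illustration (the weight of 7 and the toy labels are arbitrary), not the specific regularization developed in the paper:

```python
import numpy as np

def weighted_bce(y_true, y_prob, pos_weight):
    """Binary cross-entropy with a positive-class weight.

    `pos_weight` up-weights the rare 'fails' class so the optimizer cannot
    minimize the loss by always predicting 'does not fail'.
    """
    eps = 1e-7
    y_prob = np.clip(y_prob, eps, 1 - eps)   # guard against log(0)
    loss = -(pos_weight * y_true * np.log(y_prob)
             + (1 - y_true) * np.log(1 - y_prob))
    return loss.mean()

# Mostly-negative labels, as when most material does not fail.
y = np.array([0, 0, 0, 0, 0, 0, 0, 1])
p = np.full(8, 0.5)                          # uninformative predictions
plain    = weighted_bce(y, p, pos_weight=1.0)
weighted = weighted_bce(y, p, pos_weight=7.0)  # rare class counts 7x
```

With the weight set near the negative-to-positive ratio, errors on the rare class dominate the gradient, which is the intended effect of such loss-based regularization.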
Inversion using a new low-dimensional representation of complex binary geological media based on a deep neural network
Efficient and high-fidelity prior sampling and inversion for complex
geological media is still a largely unsolved challenge. Here, we use a deep
neural network of the variational autoencoder type to construct a parametric
low-dimensional base model parameterization of complex binary geological media.
For inversion purposes, it has the attractive feature that random draws from an
uncorrelated standard normal distribution yield model realizations with spatial
characteristics that are in agreement with the training set. In comparison with
the most commonly used parametric representations in probabilistic inversion,
we find that our dimensionality reduction (DR) approach outperforms principal
component analysis (PCA), optimization-PCA (OPCA) and discrete cosine transform
(DCT) DR techniques for unconditional geostatistical simulation of a
channelized prior model. For the considered examples, substantial compression
ratios (200–500) are achieved. Given that the construction of our
parameterization requires a training set of several tens of thousands of prior
model realizations, our DR approach is more suited for probabilistic (or
deterministic) inversion than for unconditional (or point-conditioned)
geostatistical simulation. Probabilistic inversions of 2D steady-state and 3D
transient hydraulic tomography data are used to demonstrate the DR-based
inversion. For the 2D case study, the performance is superior compared to
current state-of-the-art multiple-point statistics inversion by sequential
geostatistical resampling (SGR). Inversion results for the 3D application are
also encouraging.
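For reference, the PCA baseline in the comparison reduces each model realization to a handful of principal-component scores. A minimal sketch follows, using synthetic rank-5 data and a compression ratio of 200 chosen to echo the range quoted above; the data and dimensions are illustrative, not the paper's:

```python
import numpy as np

def pca_compress(X, k):
    """Project realizations onto the top-k principal components.

    X : (n_realizations, n_cells) training set of model realizations
    Returns the low-dimensional codes and a decoder back to model space.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data; rows of Vt are the principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    codes = Xc @ Vt[:k].T                    # low-dimensional representation
    reconstruct = lambda z: z @ Vt[:k] + mean
    return codes, reconstruct

rng = np.random.default_rng(0)
# Hypothetical training set: 500 realizations of 1000 cells, rank 5.
X = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 1000))
codes, reconstruct = pca_compress(X, k=5)
err = np.abs(reconstruct(codes) - X).max()
compression_ratio = X.shape[1] / codes.shape[1]   # 1000 / 5 = 200
```

For binary channelized media, linear codes like these blur channel edges on reconstruction, which is why the VAE-based parameterization outperforms PCA, OPCA, and DCT in the comparison.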