Computational statistics using the Bayesian Inference Engine
This paper introduces the Bayesian Inference Engine (BIE), a general
parallel, optimised software package for parameter inference and model
selection. This package is motivated by the analysis needs of modern
astronomical surveys and the need to organise and reuse expensive derived data.
The BIE is the first platform for computational statistics designed explicitly
to enable Bayesian update and model comparison for astronomical problems.
Bayesian update is based on the representation of high-dimensional posterior
distributions using metric-ball-tree based kernel density estimation. Among its
algorithmic offerings, the BIE emphasises hybrid tempered MCMC schemes that
robustly sample multimodal posterior distributions in high-dimensional
parameter spaces. Moreover, the BIE implements a full persistence or
serialisation system that stores the full byte-level image of the running
inference and previously characterised posterior distributions for later use.
Two new algorithms to compute the marginal likelihood from the posterior
distribution, developed for and implemented in the BIE, enable model comparison
for complex models and data sets. Finally, the BIE was designed to be a
collaborative platform for applying Bayesian methodology to astronomy. It
includes an extensible, object-oriented framework that
implements every aspect of the Bayesian inference. By providing a variety of
statistical algorithms for all phases of the inference problem, the BIE lets a
scientist explore multiple approaches with a single model and data implementation.
Additional technical details and download details are available from
http://www.astro.umass.edu/bie. The BIE is distributed under the GNU GPL.
Comment: Resubmitted version.
Coupled coarse graining and Markov Chain Monte Carlo for lattice systems
We propose an efficient Markov Chain Monte Carlo method for sampling
equilibrium distributions for stochastic lattice models, capable of correctly
handling both long- and short-range particle interactions. The proposed method is a
Metropolis-type algorithm with the proposal probability transition matrix based
on the coarse-grained approximating measures introduced in a series of works of
M. Katsoulakis, A. Majda, D. Vlachos, P. Plechac, L. Rey-Bellet and
D. Tsagkarogiannis. We prove that the proposed algorithm reduces the
computational cost of evaluating energy differences and has mixing properties
comparable with those of the classical microscopic Metropolis algorithm,
controlled by the level of coarsening and the reconstruction procedure. The properties and
effectiveness of the algorithm are demonstrated with an exactly solvable
example of a one-dimensional Ising-type model, comparing the efficiency of
single spin-flip Metropolis dynamics with that of the proposed coupled Metropolis
algorithm.
Comment: 20 pages, 4 figures
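A minimal sketch of the coupled idea for a 1D Ising chain, assuming a generic coarse block-flip proposal rather than the specific coarse-grained transition matrix of the cited works: microscopic single spin flips alternate with block flips over coarse cells, and the Metropolis ratio keeps the exact Gibbs measure invariant in both cases.

```python
import math
import random

def energy(spins, J=1.0, h=0.0):
    """Nearest-neighbour 1D Ising energy with periodic boundary conditions."""
    n = len(spins)
    return -sum(J * spins[i] * spins[(i + 1) % n] + h * spins[i]
                for i in range(n))

def coupled_metropolis(n_sites=32, block=4, beta=0.5, n_steps=5000, seed=0):
    """Alternate single spin flips with coarse block-flip proposals.

    The block flip plays the role of the coarse-level move: it connects
    configurations that single flips reach only slowly, while the exact
    energy difference in the acceptance step leaves the microscopic
    Gibbs measure invariant.
    """
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n_sites)]
    mags = []
    for t in range(n_steps):
        trial = spins[:]
        if t % 2 == 0:
            # Microscopic sub-step: flip one spin.
            trial[rng.randrange(n_sites)] *= -1
        else:
            # Coarse sub-step: flip a whole block of consecutive spins.
            i = rng.randrange(n_sites)
            for k in range(block):
                trial[(i + k) % n_sites] *= -1
        d_e = energy(trial) - energy(spins)
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            spins = trial
        mags.append(sum(spins) / n_sites)
    return mags
```

The function names and the 50/50 alternation schedule are illustrative assumptions; the paper's method instead builds the proposal from coarse-grained approximating measures.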
Approximate Bayesian Computation by Subset Simulation
A new Approximate Bayesian Computation (ABC) algorithm for Bayesian updating
of model parameters is proposed in this paper, which combines the ABC
principles with the technique of Subset Simulation for efficient rare-event
simulation, first developed by S.K. Au and J.L. Beck [1]. It has been named
ABC-SubSim. The idea is to choose the nested decreasing sequence of regions in
Subset Simulation as the regions that correspond to increasingly closer
approximations of the actual data vector in observation space. The efficiency
of the algorithm is demonstrated in two examples that illustrate some of the
challenges faced in real-world applications of ABC. We show that the proposed
algorithm outperforms other recent sequential ABC algorithms in terms of
computational efficiency while achieving the same, or better, accuracy in
the posterior distribution. We also show that ABC-SubSim readily
provides an estimate of the evidence (marginal likelihood) for posterior model
class assessment, as a by-product.
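The nested-tolerance idea can be sketched as follows. This is a simplified illustration in the spirit of ABC-SubSim, not the exact algorithm of the paper: the conditional level probability p0, the Gaussian proposal scale, and the function names are all assumptions.

```python
import random

def abc_subsim(observed, prior_sample, simulate, n=300, p0=0.2,
               n_levels=3, seed=0):
    """Adaptive-tolerance ABC with nested levels (ABC-SubSim spirit).

    At each level, the p0 fraction of samples closest to the data become
    seeds; the next tolerance is the corresponding distance quantile, and
    the population is regrown with small local moves accepted only if
    they stay inside the new tolerance region.
    """
    rng = random.Random(seed)
    thetas = [prior_sample(rng) for _ in range(n)]
    dists = [abs(simulate(th, rng) - observed) for th in thetas]
    for _ in range(n_levels):
        order = sorted(range(n), key=lambda i: dists[i])
        n_keep = max(1, int(p0 * n))
        keep = order[:n_keep]               # seeds for the next level
        eps = dists[order[n_keep - 1]]      # next (smaller) tolerance
        new_t, new_d = [], []
        while len(new_t) < n:
            i = rng.choice(keep)
            prop = thetas[i] + rng.gauss(0.0, 0.1)   # local move (assumed scale)
            d = abs(simulate(prop, rng) - observed)
            if d <= eps:                    # accept only inside the level
                new_t.append(prop)
                new_d.append(d)
            else:                           # otherwise stay at the seed
                new_t.append(thetas[i])
                new_d.append(dists[i])
        thetas, dists = new_t, new_d
    return thetas
```

As a usage example, inferring the mean of a Gaussian from a sample-mean summary statistic concentrates the population around the observed value after a few levels.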
Multilevel coarse graining and nano-pattern discovery in many-particle stochastic systems
In this work we propose a hierarchy of Monte Carlo methods for sampling
equilibrium properties of stochastic lattice systems with competing short and
long-range interactions. Each Monte Carlo step is composed of two or more
sub-steps that efficiently couple coarse and microscopic state spaces. The method can
be designed to sample either the exact target distribution or controlled-error
approximations of it, providing information at different levels of resolution,
as well as at the microscopic level. In both strategies the method achieves
significant reduction of the computational cost compared to conventional Markov
Chain Monte Carlo methods. Applications in phase transition and pattern
formation problems confirm the efficiency of the proposed methods.
Comment: 37 pages
Entrograms and coarse graining of dynamics on complex networks
Using an information theoretic point of view, we investigate how a dynamics
acting on a network can be coarse grained through the use of graph partitions.
Specifically, we are interested in how aggregating the state space of a Markov
process according to a partition affects the resulting lower-dimensional
dynamics. We highlight that for a dynamics on a particular graph there may be
multiple coarse grained descriptions that capture different, incomparable
features of the original process. For instance, a coarse graining induced by
one partition may be commensurate with a time-scale separation in the dynamics,
while another coarse graining may correspond to a different lower-dimensional
dynamics that preserves the Markov property of the original process. Taking
inspiration from the literature of Computational Mechanics, we find that a
convenient tool to summarise and visualise such dynamical properties of a
coarse grained model (partition) is the entrogram. The entrogram gathers
certain information-theoretic measures, which quantify how information flows
across time steps. These information theoretic quantities include the entropy
rate, as well as a measure for the memory contained in the process, i.e., how
well the dynamics can be approximated by a first-order Markov process. We use
the entrogram to investigate how specific macro-scale connection patterns in
the state-space transition graph of the original dynamics result in desirable
properties of coarse grained descriptions. We thereby provide a fresh
perspective on the interplay between structure and dynamics in networks, and
the process of partitioning from an information theoretic perspective. We focus
on networks that may be approximated by either a core-periphery or a clustered
organization, and highlight that each of these coarse grained descriptions can
capture different aspects of a Markov process acting on the network.
Comment: 17 pages, 6 figures
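One of the information-theoretic quantities entering an entrogram, the entropy rate, can be illustrated with a minimal sketch: computing the entropy rate of a Markov chain and of its lumped (coarse-grained) version under a partition. The lumping rule below is the standard stationary-weighted aggregation; function names are illustrative, and this is not the paper's own code.

```python
import math

def stationary(P, n_iter=200):
    """Power-iterate a row-stochastic matrix to its stationary distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(n_iter):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """H = -sum_i pi_i sum_j P_ij log P_ij (natural log)."""
    pi = stationary(P)
    n = len(P)
    return -sum(pi[i] * P[i][j] * math.log(P[i][j])
                for i in range(n) for j in range(n) if P[i][j] > 0)

def lump(P, partition):
    """Aggregate states by a partition (list of lists of state indices),
    weighting each block's rows by the stationary probabilities within it."""
    pi = stationary(P)
    Q = []
    for block in partition:
        w = sum(pi[i] for i in block)
        Q.append([sum(pi[i] * sum(P[i][j] for j in other) for i in block) / w
                  for other in partition])
    return Q
```

For a four-state chain with two tightly connected pairs, lumping the pairs yields a two-state chain whose entropy rate is lower than the original's, reflecting the information discarded by the coarse graining.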