10,607 research outputs found
Analysis of a large number of Markov chains competing for transitions
We consider the behavior of a stochastic system composed of several identically distributed, but not independent, discrete-time absorbing Markov chains competing at each instant for a transition. The competition consists of determining at each instant, using a given probability distribution, the single Markov chain allowed to make a transition. We analyze the first time at which one of the Markov chains reaches its absorbing state. When the number of Markov chains goes to infinity, we analyze the asymptotic behavior of the system for an arbitrary probability mass function governing the competition. We give conditions for the existence of the asymptotic distribution and show how these results apply to cluster-based distributed systems when the competition between the Markov chains is handled using a geometric distribution.
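The competition mechanism described above can be sketched in a short simulation: at each instant a single chain, drawn from a truncated geometric distribution over chain indices, is allowed to transition. The two-state toy chains and all parameter names are illustrative assumptions, not the paper's model.

```python
import random

def simulate_first_absorption(n_chains=50, p_geom=0.3, p_absorb=0.1, seed=0):
    """Simulate identically distributed absorbing Markov chains that
    compete for transitions.

    At each instant exactly one chain is selected to make a transition,
    using a truncated geometric distribution over chain indices (the
    case the abstract relates to cluster-based distributed systems).
    Each chain is a toy chain absorbed with probability p_absorb per
    transition -- an illustrative assumption, not the paper's model.
    Returns the first instant at which some chain is absorbed.
    """
    rng = random.Random(seed)
    # Truncated geometric weights over chain indices 0..n_chains-1.
    weights = [p_geom * (1 - p_geom) ** i for i in range(n_chains)]
    total = sum(weights)
    t = 0
    while True:
        t += 1
        # Select the single chain allowed to make a transition.
        r = rng.random() * total
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        # The selected chain transitions; it may hit its absorbing state.
        if rng.random() < p_absorb:
            return t
```

Since only one chain moves per instant, the distribution governing the selection directly shapes the first-absorption time studied in the paper.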
Computational statistics using the Bayesian Inference Engine
This paper introduces the Bayesian Inference Engine (BIE), a general
parallel, optimised software package for parameter inference and model
selection. This package is motivated by the analysis needs of modern
astronomical surveys and the need to organise and reuse expensive derived data.
The BIE is the first platform for computational statistics designed explicitly
to enable Bayesian update and model comparison for astronomical problems.
Bayesian update is based on the representation of high-dimensional posterior
distributions using metric-ball-tree based kernel density estimation. Among its
algorithmic offerings, the BIE emphasises hybrid tempered MCMC schemes that
robustly sample multimodal posterior distributions in high-dimensional
parameter spaces. Moreover, the BIE implements a full persistence or
serialisation system that stores the full byte-level image of the running
inference and previously characterised posterior distributions for later use.
Two new algorithms to compute the marginal likelihood from the posterior
distribution, developed for and implemented in the BIE, enable model comparison
for complex models and data sets. Finally, the BIE was designed to be a
collaborative platform for applying Bayesian methodology to astronomy. It
includes an extensible, object-oriented framework that implements every
aspect of Bayesian inference. By providing a variety of statistical
algorithms for all phases of the inference problem, the BIE lets a scientist
explore a variety of approaches with a single model and data implementation.
Additional technical details and download details are available from
http://www.astro.umass.edu/bie. The BIE is distributed under the GNU GPL.
Comment: Resubmitted version.
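The hybrid tempered MCMC schemes the abstract emphasises can be illustrated with a minimal parallel-tempering sketch: Metropolis chains at several inverse temperatures, with swap moves so the cold chain can traverse a multimodal posterior. This is a generic illustration, not the BIE's actual implementation; the target `bimodal` and all parameters are assumptions.

```python
import math
import random

def tempered_mcmc(log_post, n_steps=5000, betas=(1.0, 0.5, 0.25), seed=1):
    """Minimal parallel-tempering sketch (not the BIE's implementation).

    One Metropolis chain per inverse temperature beta targets the
    tempered density exp(beta * log_post); swap moves between adjacent
    temperatures let the cold (beta = 1) chain escape local modes.
    Returns the samples drawn by the cold chain.
    """
    rng = random.Random(seed)
    x = [0.0] * len(betas)
    logp = [log_post(xi) for xi in x]
    samples = []
    for _ in range(n_steps):
        # Within-temperature random-walk Metropolis updates.
        for k, beta in enumerate(betas):
            prop = x[k] + rng.gauss(0.0, 1.0)
            lp = log_post(prop)
            if math.log(rng.random()) < beta * (lp - logp[k]):
                x[k], logp[k] = prop, lp
        # Propose a swap between a random adjacent temperature pair.
        k = rng.randrange(len(betas) - 1)
        log_accept = (betas[k] - betas[k + 1]) * (logp[k + 1] - logp[k])
        if math.log(rng.random()) < log_accept:
            x[k], x[k + 1] = x[k + 1], x[k]
            logp[k], logp[k + 1] = logp[k + 1], logp[k]
        samples.append(x[0])  # the chain at beta = 1
    return samples

# Toy multimodal target: mixture of unit Gaussians at -5 and +5.
def bimodal(t):
    return math.log(math.exp(-0.5 * (t - 5.0) ** 2)
                    + math.exp(-0.5 * (t + 5.0) ** 2))
```

A single Metropolis chain with unit-scale proposals rarely crosses the barrier between the two modes; the hotter chains cross easily and feed both modes to the cold chain through swaps.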
Phase transitions for Quantum Markov Chains associated with Ising type models on a Cayley tree
The main aim of the present paper is to prove the existence of a phase
transition in quantum Markov chain (QMC) scheme for the Ising type models on a
Cayley tree. Note that this kind of model has no one-dimensional analogue,
i.e., the considered model exists only on trees. In this paper, we
provide a more general construction of forward QMC. In that construction, a QMC
is defined as a weak limit of finite-volume states with boundary conditions,
i.e., the QMC depends on the boundary conditions. Our main result states the
existence of a phase transition for the Ising model with competing interactions
on a Cayley tree of order two. By a phase transition we mean the existence of
two distinct QMCs that are not quasi-equivalent and whose supports do not
overlap. We also study an algebraic property of the disordered phase of the
model, which is a new phenomenon even in the classical setting.
Comment: 24 pages. arXiv admin note: text overlap with arXiv:1011.225
Distributions associated with general runs and patterns in hidden Markov models
This paper gives a method for computing distributions associated with
patterns in the state sequence of a hidden Markov model, conditional on
observing all or part of the observation sequence. Probabilities are computed
for very general classes of patterns (competing patterns and generalized later
patterns), and thus, the theory includes as special cases results for a large
class of problems that have wide application. The unobserved state sequence is
assumed to be Markovian with a general order of dependence. An auxiliary Markov
chain is associated with the state sequence and is used to simplify the
computations. Two examples are given to illustrate the use of the methodology.
Whereas the first application mainly illustrates the basic steps in applying
the theory, the second is a more detailed application to DNA sequences and
shows that the methods can be adapted to include restrictions informed by
biological knowledge.
Comment: Published at http://dx.doi.org/10.1214/07-AOAS125 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
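A much-simplified version of the auxiliary-chain idea can be sketched for a first-order HMM and a single contiguous pattern: augment the forward recursion with a pattern-progress coordinate and read off the conditional probability that the hidden sequence contains the pattern. The function, its naive progress update (which ignores internal pattern overlaps), and all names are illustrative assumptions, not the paper's general construction.

```python
def pattern_prob_given_obs(A, B, pi, obs, pattern):
    """P(hidden sequence contains `pattern` as a contiguous run | obs),
    computed with an auxiliary chain over (state, pattern-progress)
    pairs. Simplified illustration: first-order HMM, one pattern, and
    a naive progress reset that ignores internal pattern overlaps.

    A[i][j]: state transition probs; B[i][o]: emission probs;
    pi[i]: initial state probs; pattern: list of state indices.
    """
    n, m = len(pi), len(pattern)

    def progress(k, state):
        # Advance the pattern-progress counter (naive: no KMP overlap).
        if state == pattern[k]:
            return k + 1
        return 1 if state == pattern[0] else 0

    # alpha[i][k] = P(obs so far, current state i, progress k);
    # k == m is absorbing ("pattern has occurred").
    alpha = [[0.0] * (m + 1) for _ in range(n)]
    for i in range(n):
        k = 1 if pattern[0] == i else 0
        alpha[i][k] += pi[i] * B[i][obs[0]]
    for t in range(1, len(obs)):
        new = [[0.0] * (m + 1) for _ in range(n)]
        for i in range(n):
            for k in range(m + 1):
                if alpha[i][k] == 0.0:
                    continue
                for j in range(n):
                    k2 = m if k == m else progress(k, j)
                    new[j][k2] += alpha[i][k] * A[i][j] * B[j][obs[t]]
        alpha = new
    total = sum(sum(row) for row in alpha)
    return sum(alpha[i][m] for i in range(n)) / total
```

Enlarging the state space to (state, progress) pairs is the essence of the auxiliary-chain simplification: pattern occurrence becomes absorption in the augmented chain, so standard forward recursions apply.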
A Stochastic Approach to Shortcut Bridging in Programmable Matter
In a self-organizing particle system, an abstraction of programmable matter,
simple computational elements called particles with limited memory and
communication self-organize to solve system-wide problems of movement,
coordination, and configuration. In this paper, we consider a stochastic,
distributed, local, asynchronous algorithm for "shortcut bridging", in which
particles self-assemble bridges over gaps while balancing the competing goals
of minimizing the length and the cost of the bridge. Army ants of the genus Eciton
have been observed exhibiting a similar behavior in their foraging trails,
dynamically adjusting their bridges to satisfy an efficiency trade-off using
local interactions. Using techniques from Markov chain analysis, we rigorously
analyze our algorithm, show it achieves a near-optimal balance between the
competing factors of path length and bridge cost, and prove that it exhibits a
dependence on the angle of the gap being "shortcut" similar to that of the ant
bridges. We also present simulation results that qualitatively compare our
algorithm with the army ant bridging behavior. Our work gives a plausible
explanation of how convergence to globally optimal configurations can be
achieved via local interactions by simple organisms (e.g., ants) with some
limited computational power and access to random bits. The proposed algorithm
also demonstrates the robustness of the stochastic approach to algorithms for
programmable matter, as it is a surprisingly simple extension of our previous
stochastic algorithm for compression.Comment: Published in Proc. of DNA23: DNA Computing and Molecular Programming
- 23rd International Conference, 2017. An updated journal version will appear
in the DNA23 Special Issue of Natural Computin
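The core of the stochastic approach, weighting configurations by a Hamiltonian that trades off two competing costs, can be sketched generically with a Metropolis chain. This is an illustration of the technique, not the paper's particle-system algorithm; the cost functions and parameters in the usage example are hypothetical.

```python
import math
import random

def metropolis_tradeoff(cost_a, cost_b, neighbors, x0, lam=1.0, gam=2.0,
                        n_steps=10000, seed=2):
    """Generic Metropolis sketch: sample configurations with weight
    exp(-(lam*cost_a(x) + gam*cost_b(x))), so the chain favors
    configurations balancing two competing costs (here playing the
    role of path length and bridge cost). Illustrative only, not the
    paper's distributed particle algorithm.
    """
    rng = random.Random(seed)
    x = x0
    h = lam * cost_a(x) + gam * cost_b(x)
    best, best_h = x, h
    for _ in range(n_steps):
        y = rng.choice(neighbors(x))  # symmetric proposal
        hy = lam * cost_a(y) + gam * cost_b(y)
        # Accept with probability min(1, exp(h - hy)).
        if math.log(rng.random()) < h - hy:
            x, h = y, hy
            if h < best_h:
                best, best_h = x, h
    return best

# Toy usage: an integer "anchor" position trading off distance to a
# target (a length-like term) against a position-dependent cost.
best = metropolis_tradeoff(
    cost_a=lambda x: abs(x - 10),     # length-like term, minimized at 10
    cost_b=lambda x: abs(x) / 5.0,    # cost-like term, minimized at 0
    neighbors=lambda x: [x - 1, x + 1],
    x0=0,
)
```

Tuning the weights lam and gam shifts the stationary distribution along the trade-off curve, which is the same lever the abstract describes for balancing path length against bridge cost.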
A Bayesian Approach to the Detection Problem in Gravitational Wave Astronomy
The analysis of data from gravitational wave detectors can be divided into
three phases: search, characterization, and evaluation. The evaluation of the
detection - determining whether a candidate event is astrophysical in origin or
some artifact created by instrument noise - is a crucial step in the analysis.
The on-going analyses of data from ground based detectors employ a frequentist
approach to the detection problem. A detection statistic is chosen, for which
background levels and detection efficiencies are estimated from Monte Carlo
studies. This approach frames the detection problem in terms of an infinite
collection of trials, with the actual measurement corresponding to some
realization of this hypothetical set. Here we explore an alternative, Bayesian
approach to the detection problem that considers prior information and the
actual data in hand. Our particular focus is on the computational techniques
used to implement the Bayesian analysis. We find that the Parallel Tempered
Markov Chain Monte Carlo (PTMCMC) algorithm is able to address all three phases
of the analysis in a coherent framework. The signals are found by locating the
posterior modes, the model parameters are characterized by mapping out the
joint posterior distribution, and finally, the model evidence is computed by
thermodynamic integration. As a demonstration, we consider the detection
problem of selecting between models describing the data as instrument noise, or
instrument noise plus the signal from a single compact galactic binary. The
evidence ratios, or Bayes factors, computed by the PTMCMC algorithm are found
to be in close agreement with those computed using a Reversible Jump Markov
Chain Monte Carlo algorithm.
Comment: 19 pages, 12 figures, revised to address referee's comments
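The thermodynamic-integration step, computing the evidence from tempered chains, can be sketched as a quadrature over inverse temperature: log Z is the integral from beta = 0 to 1 of the mean log-likelihood under the tempered posterior. The trapezoid-rule helper below is a generic illustration, not the paper's code.

```python
def log_evidence_thermo(log_like_means, betas):
    """Thermodynamic integration sketch: given estimates of
    E_beta[log L] from tempered chains at inverse temperatures
    `betas` (ascending, spanning ~0 to 1), approximate
        log Z = integral_0^1 E_beta[log L] d(beta)
    by the trapezoid rule.
    """
    logz = 0.0
    for k in range(len(betas) - 1):
        logz += 0.5 * (log_like_means[k] + log_like_means[k + 1]) \
                * (betas[k + 1] - betas[k])
    return logz
```

Running this once per model (noise-only vs. noise-plus-signal) and differencing the two log evidences gives the log Bayes factor used for the model selection described above.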
Forecasting US bond default ratings allowing for previous and initial state dependence in an ordered probit model
In this paper we investigate the ability of a number of different ordered probit models to predict ratings based on firm-specific data on business and financial risks. We investigate models based on momentum, drift and ageing and compare them against alternatives that take into account the initial rating of the firm and its previous actual rating. Using data on US bond-issuing firms rated by Fitch over the years 2000 to 2007, we compare the performance of these models in predicting the rating in-sample and out-of-sample using root mean squared errors, Diebold-Mariano tests of forecast performance and contingency tables. We conclude that initial and previous states have a substantial influence on rating prediction.
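The prediction step of an ordered probit model can be sketched as follows: given a linear index x'beta (which, per the abstract, could include dummies for the firm's previous and initial ratings) and ascending cutpoints, each ordered rating category's probability is a difference of normal CDFs. The index and cutpoints below are hypothetical inputs, not the paper's estimates.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cutpoints):
    """Predicted category probabilities in an ordered probit model:
        P(y = k) = Phi(c_k - xb) - Phi(c_{k-1} - xb),
    with c_0 = -inf and c_K = +inf, for ascending cutpoints
    c_1 < ... < c_{K-1}. `xb` is the linear index x'beta.
    Illustrative sketch, not the paper's estimated model.
    """
    cdf = [norm_cdf(c - xb) for c in cutpoints]
    probs = [cdf[0]]
    probs += [cdf[k] - cdf[k - 1] for k in range(1, len(cutpoints))]
    probs.append(1.0 - cdf[-1])
    return probs
```

A point forecast of the rating is then the category with the highest predicted probability, which is the quantity compared in-sample and out-of-sample in the abstract.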
Modeling sequences and temporal networks with dynamic community structures
In evolving complex systems such as air traffic and social organizations,
collective effects emerge from their many components' dynamic interactions.
While the dynamic interactions can be represented by temporal networks with
nodes and links that change over time, they remain highly complex. It is
therefore often necessary to use methods that extract the temporal networks'
large-scale dynamic community structure. However, such methods are subject to
overfitting or suffer from effects of arbitrary, a priori imposed timescales,
which should instead be extracted from data. Here we simultaneously address
both problems and develop a principled data-driven method that determines
relevant timescales and identifies patterns of dynamics that take place on
networks as well as shape the networks themselves. We base our method on an
arbitrary-order Markov chain model with community structure, and develop a
nonparametric Bayesian inference framework that identifies the simplest such
model that can explain temporal interaction data.
Comment: 15 pages, 6 figures, 2 tables
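The idea of letting the data determine the relevant memory, rather than imposing it a priori, can be illustrated with a much simpler stand-in: fitting Markov chains of increasing order to a symbol sequence and selecting one by BIC. The paper itself uses a nonparametric Bayesian framework with community structure; BIC is an illustrative substitute.

```python
import math
from collections import Counter

def select_markov_order(seq, max_order=3):
    """Fit Markov chains of order 0..max_order to `seq` by maximum
    likelihood and return the order minimizing BIC -- a simple,
    illustrative stand-in for the data-driven model selection in the
    abstract (the paper's framework is nonparametric Bayesian).
    """
    s = len(set(seq))  # alphabet size
    best = None
    for order in range(max_order + 1):
        ctx = Counter()    # counts of each length-`order` context
        joint = Counter()  # counts of (context, next symbol) pairs
        for t in range(order, len(seq)):
            c = tuple(seq[t - order:t])
            ctx[c] += 1
            joint[(c, seq[t])] += 1
        # Maximized log-likelihood with empirical transition probs.
        loglik = sum(n * math.log(n / ctx[c]) for (c, _), n in joint.items())
        n_params = (s ** order) * (s - 1)
        bic = -2.0 * loglik + n_params * math.log(len(seq) - order)
        if best is None or bic < best[0]:
            best = (bic, order)
    return best[1]
```

On a strictly alternating sequence, for example, the order-1 model fits perfectly with few parameters, so BIC selects order 1 over both the memoryless order-0 model and the over-parameterized higher orders.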