Formal analysis techniques for gossiping protocols
We give a survey of formal verification techniques that can be used to corroborate existing experimental results for gossiping protocols in a rigorous manner. We present properties of interest for gossiping protocols and discuss how various formal evaluation techniques can be employed to predict them.
Cross-entropy optimisation of importance sampling parameters for statistical model checking
Statistical model checking avoids the exponential growth of states associated
with probabilistic model checking by estimating properties from multiple
executions of a system and by giving results within confidence bounds. Rare
properties are often very important but pose a particular challenge for
simulation-based approaches, hence a key objective under these circumstances is
to reduce the number and length of simulations necessary to produce a given
level of confidence. Importance sampling is a well-established technique that
achieves this, however to maintain the advantages of statistical model checking
it is necessary to find good importance sampling distributions without
considering the entire state space.
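The rare-event estimation idea can be sketched in a few lines of Python. This is a minimal illustration, not the paper's method: the toy rare event (the upper tail of a binomial), the hand-picked tilted probability q, and all names and parameter values are assumptions made for the example.

```python
import math
import random

random.seed(0)

def rare_prob_is(n=50, p=0.1, threshold=20, q=0.4, samples=20000):
    """Importance-sampling estimate of P(Binomial(n, p) >= threshold).
    Each trial succeeds with the tilted probability q instead of p, and
    every run that hits the rare event contributes its likelihood ratio
    (p/q)**k * ((1-p)/(1-q))**(n-k), which makes the estimator unbiased
    under the true distribution."""
    total = 0.0
    for _ in range(samples):
        k = sum(random.random() < q for _ in range(n))
        if k >= threshold:
            total += (p / q) ** k * ((1 - p) / (1 - q)) ** (n - k)
    return total / samples
```

With p = 0.1 the event {k >= 20} is far in the tail, so crude Monte Carlo with 20,000 runs would almost never observe it; tilting q up to 0.4 makes it routine while the weights correct the bias.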
Motivated by the above, we present a simple algorithm that uses the notion of
cross-entropy to find the optimal parameters for an importance sampling
distribution. In contrast to previous work, our algorithm uses a low
dimensional vector of parameters to define this distribution and thus avoids
the often intractable explicit representation of a transition matrix. We show
that our parametrisation leads to a unique optimum and can produce many orders
of magnitude improvement in simulation efficiency. We demonstrate the efficacy
of our methodology by applying it to models from reliability engineering and
biochemistry.
Comment: 16 pages, 8 figures, LNCS style
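For a one-parameter tilted distribution the cross-entropy update has a closed form, which gives a flavour of the low-dimensional parametrisation the abstract describes. The sketch below is an assumption-laden toy (Bernoulli trials, a binomial-tail rare event, invented names and constants), not the paper's algorithm: each round keeps the runs that hit the rare event, weights them by their likelihood ratio back to the true dynamics, and re-estimates q as the weighted mean success fraction, the cross-entropy minimiser for Bernoulli trials.

```python
import random

random.seed(1)

def cross_entropy_q(n=50, p=0.1, threshold=20, q0=0.25, rounds=5, batch=10000):
    """Cross-entropy iteration for a single importance-sampling
    parameter: the tilted per-trial success probability q.  Runs are
    drawn under the current q; those hitting {k >= threshold} are
    weighted by (p/q)**k * ((1-p)/(1-q))**(n-k), and q is updated to
    the weighted mean success fraction."""
    q = q0
    for _ in range(rounds):
        num = den = 0.0
        for _ in range(batch):
            k = sum(random.random() < q for _ in range(n))
            if k >= threshold:
                w = (p / q) ** k * ((1 - p) / (1 - q)) ** (n - k)
                num += w * k
                den += w
        if den > 0:
            q = num / (den * n)
    return q
```

Starting from a modest tilt (q0 = 0.25, under which the event is merely uncommon), a few rounds drive q toward roughly threshold/n, where the rare event becomes typical.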
Techniques for the Fast Simulation of Models of Highly Dependable Systems
With the ever-increasing complexity and requirements of highly dependable systems, their evaluation during design and operation is becoming more crucial. Realistic models of such systems are often not amenable to analysis using conventional analytic or numerical methods. Therefore, analysts and designers turn to simulation to evaluate these models. However, accurate estimation of dependability measures of these models requires that the simulation frequently observes system failures, which are rare events in highly dependable systems. This renders ordinary simulation impractical for evaluating such systems. To overcome this problem, simulation techniques based on importance sampling have been developed, and are very effective in certain settings. When importance sampling works well, simulation run lengths can be reduced by several orders of magnitude when estimating transient as well as steady-state dependability measures. This paper reviews some of the importance-sampling techniques that have been developed in recent years to estimate dependability measures efficiently in Markov and non-Markov models of highly dependable systems.
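One classic scheme in this family, failure biasing, can be sketched on a toy birth-death model of a repairable system. Everything here is an assumption for illustration (the 3-component model, the rates EPS and MU, the biased failure probability BIAS, and the function names); it is a minimal instance of the technique, not a method from the paper.

```python
import random

random.seed(2)

M = 3          # components; the system fails when all M are down
EPS = 1e-3     # per-component failure rate
MU = 1.0       # per-component repair rate
BIAS = 0.5     # inflated failure probability used for sampling

def p_fail(s):
    """True probability that the next transition of the embedded chain
    is a failure, given s components are down (0 < s < M)."""
    lam = (M - s) * EPS
    return lam / (lam + s * MU)

def estimate_gamma(samples=20000):
    """Estimate gamma = P(reach total failure M before returning to the
    all-up state 0, starting from one failure) by failure biasing:
    sample failures with probability BIAS and carry the likelihood
    ratio back to the true dynamics."""
    total = 0.0
    for _ in range(samples):
        s, w = 1, 1.0
        while 0 < s < M:
            pf = p_fail(s)
            if random.random() < BIAS:       # biased failure step
                w *= pf / BIAS
                s += 1
            else:                            # biased repair step
                w *= (1 - pf) / (1 - BIAS)
                s -= 1
        if s == M:
            total += w
    return total / samples
```

Under the true dynamics the chain almost always drifts back to 0, so crude simulation would need on the order of a million runs per observed failure; with BIAS = 0.5 roughly a quarter of the runs reach total failure, and the weights restore unbiasedness.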
Markov chain Monte Carlo tests for designed experiments
We consider conditional exact tests of factor effects in designed experiments
for discrete response variables. Similarly to the analysis of contingency
tables, a Markov chain Monte Carlo method can be used for performing exact
tests, when large-sample approximations are poor and the enumeration of the
conditional sample space is infeasible. For designed experiments with a single
observation for each run, we formulate log-linear or logistic models and
consider a connected Markov chain over an appropriate sample space. In
particular, we investigate fractional factorial designs with runs,
noting correspondences to the models for contingency tables.
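The contingency-table case the abstract alludes to can be sketched directly. This is an illustrative Metropolis version under stated assumptions, not the paper's construction (the paper builds connected chains for designed experiments via more elaborate moves): the chain uses the basic +1/-1 swap moves that preserve all margins, targets the conditional hypergeometric distribution, and reports the exact-test p-value for a chi-square statistic; all function names are mine.

```python
import math
import random

random.seed(3)

def chisq(table):
    """Pearson chi-square statistic against independence."""
    rs = [sum(row) for row in table]
    cs = [sum(col) for col in zip(*table)]
    n = float(sum(rs))
    return sum((table[i][j] - rs[i] * cs[j] / n) ** 2 / (rs[i] * cs[j] / n)
               for i in range(len(rs)) for j in range(len(cs)))

def log_weight(table):
    """Log conditional (hypergeometric) weight given the margins,
    proportional to 1 / prod_ij n_ij!."""
    return -sum(math.lgamma(x + 1) for row in table for x in row)

def swap_proposal(table):
    """Basic move: pick two rows and two columns and apply the +1/-1
    pattern preserving all row and column sums; return the table
    unchanged if any cell would go negative."""
    i, j = random.sample(range(len(table)), 2)
    k, l = random.sample(range(len(table[0])), 2)
    s = random.choice((1, -1))
    new = [row[:] for row in table]
    new[i][k] += s; new[i][l] -= s
    new[j][k] -= s; new[j][l] += s
    if min(new[i][k], new[i][l], new[j][k], new[j][l]) < 0:
        return table
    return new

def mcmc_exact_pvalue(obs, steps=20000, burn=1000):
    """Metropolis chain over tables with the observed margins; the
    p-value is the fraction of post-burn-in states whose chi-square
    statistic is at least the observed one."""
    s0 = chisq(obs)
    cur = [row[:] for row in obs]
    cur_lw = log_weight(cur)
    hits = total = 0
    for t in range(steps):
        prop = swap_proposal(cur)
        lw = log_weight(prop)
        if random.random() < math.exp(min(0.0, lw - cur_lw)):
            cur, cur_lw = prop, lw
        if t >= burn:
            total += 1
            hits += chisq(cur) >= s0
    return hits / total
```

Because the proposal is symmetric, the acceptance ratio reduces to the weight ratio, and for a 2x2 table the chain's answer can be checked against full enumeration of the conditional sample space.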
A Bayesian approach for inferring neuronal connectivity from calcium fluorescent imaging data
Deducing the structure of neural circuits is one of the central problems of
modern neuroscience. Recently-introduced calcium fluorescent imaging methods
permit experimentalists to observe network activity in large populations of
neurons, but these techniques provide only indirect observations of neural
spike trains, with limited time resolution and signal quality. In this work we
present a Bayesian approach for inferring neural circuitry given this type of
imaging data. We model the network activity in terms of a collection of coupled
hidden Markov chains, with each chain corresponding to a single neuron in the
network and the coupling between the chains reflecting the network's
connectivity matrix. We derive a Monte Carlo Expectation-Maximization
algorithm for fitting the model parameters; to obtain the sufficient statistics
in a computationally-efficient manner, we introduce a specialized
blockwise-Gibbs algorithm for sampling from the joint activity of all observed
neurons given the observed fluorescence data. We perform large-scale
simulations of randomly connected neuronal networks with biophysically
realistic parameters and find that the proposed methods can accurately infer
the connectivity in these networks given reasonable experimental and
computational constraints. In addition, the estimation accuracy may be improved
significantly by incorporating prior knowledge about the sparseness of
connectivity in the network, via standard L penalization methods.Comment: Published in at http://dx.doi.org/10.1214/09-AOAS303 the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org
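The single-chain building block of such blockwise sampling, forward-filter backward-sample for one binary hidden chain given noisy observations, can be sketched as follows. This is a simplified stand-in under assumptions of my own (two states, Gaussian emissions, the transition and noise constants below), not the paper's coupled model with calcium dynamics; all names are illustrative.

```python
import math
import random

random.seed(4)

P01, P10 = 0.05, 0.3          # onset / offset probabilities (state 1 = spike)
MU, SIGMA = (0.0, 3.0), 0.5   # emission mean per state, shared noise sd
TRANS = [[1 - P01, P01], [P10, 1 - P10]]

def lik(y, s):
    """Unnormalised Gaussian emission likelihood of observation y in state s."""
    return math.exp(-0.5 * ((y - MU[s]) / SIGMA) ** 2)

def ffbs(obs):
    """Forward-filter backward-sample: run the normalised forward
    filter, then draw the entire hidden path in one block from its
    exact joint posterior given the observations."""
    T = len(obs)
    alpha = [[0.5 * lik(obs[0], s) for s in (0, 1)]]
    z = sum(alpha[0])
    alpha[0] = [a / z for a in alpha[0]]
    for t in range(1, T):
        cur = [lik(obs[t], s) * sum(alpha[-1][r] * TRANS[r][s] for r in (0, 1))
               for s in (0, 1)]
        z = sum(cur)
        alpha.append([a / z for a in cur])
    path = [0] * T
    path[-1] = int(random.random() < alpha[-1][1])
    for t in range(T - 2, -1, -1):
        w0 = alpha[t][0] * TRANS[0][path[t + 1]]
        w1 = alpha[t][1] * TRANS[1][path[t + 1]]
        path[t] = int(random.random() < w1 / (w0 + w1))
    return path
```

Sampling whole paths in one block, rather than one time step at a time, is what makes the Gibbs sweep mix well; a Monte Carlo EM loop would alternate such draws with parameter updates.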