Efficient and exact sampling of simple graphs with given arbitrary degree sequence
Uniform sampling from graphical realizations of a given degree sequence is a
fundamental component in simulation-based measurements of network observables,
with applications ranging from epidemics through social networks to Internet
modeling. Existing graph sampling methods are either link-swap based
(Markov-Chain Monte Carlo algorithms) or stub-matching based (the Configuration
Model). Both types are ill-controlled, with typically unknown mixing times for
link-swap methods and uncontrolled rejections for the Configuration Model. Here
we propose an efficient, polynomial time algorithm that generates statistically
independent graph samples with a given, arbitrary, degree sequence. The
algorithm provides a weight associated with each sample, allowing the
observable to be measured either uniformly over the graph ensemble, or,
alternatively, with a desired distribution. Unlike other algorithms, this
method always produces a sample, without back-tracking or rejections. Using a
central limit theorem-based reasoning, we argue that, for large N, and for
degree sequences admitting many realizations, the sample weights are expected
to have a lognormal distribution. As examples, we apply our algorithm to
generate networks with degree sequences drawn from power-law distributions and
from binomial distributions.
Comment: 8 pages, 3 figures
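To make the rejection problem of stub matching concrete, here is a minimal sketch (not the paper's algorithm, which avoids rejections entirely) of the Configuration Model: stubs are paired uniformly at random and any pairing containing a self-loop or multi-edge is discarded, illustrating the uncontrolled rejections the abstract criticizes.

```python
import random
from collections import Counter

def configuration_model_sample(degrees, max_tries=10000):
    """Stub-matching (Configuration Model) sampler with rejection.

    Pairs up degree 'stubs' uniformly at random and rejects any
    pairing that yields self-loops or multi-edges, so it may need
    many attempts. Returns a set of edges of a simple graph
    realizing `degrees`, or None if all attempts fail.
    """
    stubs = [node for node, d in enumerate(degrees) for _ in range(d)]
    assert len(stubs) % 2 == 0, "degree sum must be even"
    for _ in range(max_tries):
        random.shuffle(stubs)
        edges = set()
        ok = True
        for i in range(0, len(stubs), 2):
            u, v = stubs[i], stubs[i + 1]
            e = (min(u, v), max(u, v))
            if u == v or e in edges:  # self-loop or multi-edge: reject
                ok = False
                break
            edges.add(e)
        if ok:
            return edges
    return None

edges = configuration_model_sample([2, 2, 2, 1, 1])
# every accepted sample realizes the requested degree sequence
deg = Counter()
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
```

The rejection probability grows with the number of heavy-degree nodes, which is why the abstract's weight-based sequential construction is preferable for skewed (e.g. power-law) degree sequences.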
Balancing Scalability and Uniformity in SAT Witness Generators
Constrained-random simulation is the predominant approach used in the
industry for functional verification of complex digital designs. The
effectiveness of this approach depends on two key factors: the quality of
constraints used to generate test vectors, and the randomness of solutions
generated from a given set of constraints. In this paper, we focus on the
second problem, and present an algorithm that significantly improves the
state-of-the-art of (almost-)uniform generation of solutions of large Boolean
constraints. Our algorithm provides strong theoretical guarantees on the
uniformity of generated solutions and scales to problems involving hundreds of
thousands of variables.
Comment: This is a full version of the DAC 2014 paper
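The standard route to almost-uniform witness generation is to partition the solution space with random XOR (parity) constraints, i.e. a universal hash family, and pick a witness from one small cell. The toy sketch below (assumed names; a brute-force enumerator stands in for a real SAT solver, and the careful cell-size bounds of the actual algorithm are omitted) illustrates the idea.

```python
import itertools
import random

def solutions(clauses, n_vars):
    """Brute-force all satisfying assignments of a CNF formula.
    Toy stand-in for a SAT solver; literals are +/- 1-based ints."""
    sols = []
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in cl) for cl in clauses):
            sols.append(bits)
    return sols

def xor_cell(sols, n_vars, k, rng):
    """Keep only solutions satisfying k random XOR (parity) constraints,
    i.e. one cell of a random universal-hash partition of the space."""
    cell = sols
    for _ in range(k):
        mask = [rng.random() < 0.5 for _ in range(n_vars)]
        parity = rng.random() < 0.5
        cell = [s for s in cell
                if (sum(b for b, m in zip(s, mask) if m) % 2 == 1) == parity]
    return cell

rng = random.Random(0)
# (x1 or x2) and (not x1 or x3) and (x2 or not x3)
cnf = [[1, 2], [-1, 3], [2, -3]]
sols = solutions(cnf, 3)
cell = xor_cell(sols, 3, 1, rng)
# draw a witness from the hashed cell (fall back if the cell is empty)
witness = rng.choice(cell) if cell else rng.choice(sols)
```

Because the XOR constraints split the solution set into roughly equal cells, a uniform draw from one cell is approximately uniform over all witnesses, and the hashing scales to large formulas where enumeration is hopeless.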
Flexible constrained sampling with guarantees for pattern mining
Pattern sampling has been proposed as a potential solution to the infamous
pattern explosion. Instead of enumerating all patterns that satisfy the
constraints, individual patterns are sampled proportional to a given quality
measure. Several sampling algorithms have been proposed, but each of them has
its limitations when it comes to 1) flexibility in terms of quality measures
and constraints that can be used, and/or 2) guarantees with respect to sampling
accuracy. We therefore present Flexics, the first flexible pattern sampler that
supports a broad class of quality measures and constraints, while providing
strong guarantees regarding sampling accuracy. To achieve this, we leverage the
perspective on pattern mining as a constraint satisfaction problem and build
upon the latest advances in sampling solutions in SAT as well as existing
pattern mining algorithms. Furthermore, the proposed algorithm is applicable to
a variety of pattern languages, which allows us to introduce and tackle the
novel task of sampling sets of patterns. We introduce and empirically evaluate
two variants of Flexics: 1) a generic variant that addresses the well-known
itemset sampling task and the novel pattern set sampling task as well as a wide
range of expressive constraints within these tasks, and 2) a specialized
variant that exploits existing frequent itemset techniques to achieve
substantial speed-ups. Experiments show that Flexics is both accurate and
efficient, making it a useful tool for pattern-based data exploration.
Comment: Accepted for publication in the Data Mining & Knowledge Discovery
journal (ECML/PKDD 2017 journal track)
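The target distribution of a pattern sampler is easy to state even when sampling from it efficiently is hard: itemsets satisfying the constraints are drawn with probability proportional to a quality measure. The sketch below (hypothetical names; exhaustive enumeration over a tiny dataset, whereas Flexics avoids enumeration via hashing-based sampling) only illustrates that target distribution, with frequency as the quality measure and a minimum-size constraint.

```python
import itertools
import random

def sample_itemsets(transactions, min_size, rng, n=1):
    """Toy pattern sampler: enumerate itemsets satisfying a constraint
    (here: at least `min_size` items) and draw them with probability
    proportional to a quality measure (here: support, i.e. the number
    of transactions containing the itemset)."""
    items = sorted({i for t in transactions for i in t})
    patterns, weights = [], []
    for r in range(min_size, len(items) + 1):
        for combo in itertools.combinations(items, r):
            support = sum(1 for t in transactions if set(combo) <= t)
            if support > 0:
                patterns.append(combo)
                weights.append(support)
    return rng.choices(patterns, weights=weights, k=n)

rng = random.Random(1)
data = [{"a", "b"}, {"a", "b", "c"}, {"b", "c"}]
samples = sample_itemsets(data, min_size=2, rng=rng, n=5)
```

On this dataset the eligible patterns are ("a","b") and ("b","c") with support 2, and ("a","c") and ("a","b","c") with support 1, so the frequent pairs are sampled twice as often as the rare ones.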
Diffusive Nested Sampling
We introduce a general Monte Carlo method based on Nested Sampling (NS), for
sampling complex probability distributions and estimating the normalising
constant. The method uses one or more particles, which explore a mixture of
nested probability distributions, each successive distribution occupying ~e^-1
times the enclosed prior mass of the previous distribution. While NS
technically requires independent generation of particles, Markov Chain Monte
Carlo (MCMC) exploration fits naturally into this technique. We illustrate the
new method on a test problem and find that it can achieve four times the
accuracy of classic MCMC-based Nested Sampling, for the same computational
effort, equivalent to a factor-of-16 speedup. An additional benefit is that
more samples and a more accurate evidence value can be obtained simply by
continuing the run for longer, as in standard MCMC.
Comment: Accepted for publication in Statistics and Computing. C++ code
available at http://lindor.physics.ucsb.edu/DNes
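The core construction, each level enclosing ~e^-1 of the prior mass of the previous one, can be sketched by estimating successive likelihood thresholds from prior draws. The snippet below (assumed names; rejection sampling from the prior stands in for the MCMC exploration of the mixture of levels used by the full method) builds such a ladder for a toy 1-D Gaussian likelihood.

```python
import math
import random

def build_levels(log_like, sample_prior, n_levels, n_draws, rng):
    """Sketch of level construction in Diffusive Nested Sampling:
    each new level's log-likelihood threshold is set so that it
    encloses ~e^-1 of the prior mass within the previous level,
    estimated here as the (1 - 1/e) quantile of prior draws whose
    likelihood exceeds the current threshold."""
    thresholds = [-math.inf]
    for _ in range(n_levels):
        draws = []
        while len(draws) < n_draws:
            L = log_like(sample_prior(rng))
            if L > thresholds[-1]:  # draw constrained to the current level
                draws.append(L)
        draws.sort()
        # keep the top e^-1 fraction: threshold at the (1 - 1/e) quantile
        thresholds.append(draws[int((1 - 1 / math.e) * n_draws)])
    return thresholds

rng = random.Random(0)
# toy 1-D Gaussian log-likelihood with a Uniform(-5, 5) prior
levels = build_levels(lambda x: -0.5 * x * x,
                      lambda r: r.uniform(-5, 5),
                      n_levels=4, n_draws=500, rng=rng)
```

Rejection from the prior becomes exponentially slow as levels rise, which is exactly why the full method has particles explore a mixture of all the nested distributions with MCMC instead.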