Enumeration of labelled chain graphs and labelled essential directed acyclic graphs
A chain graph is a digraph whose strong components are undirected graphs, and a directed acyclic graph (ADG or DAG) G is essential if the Markov equivalence class of G consists of only one element. We provide recurrence relations for counting labelled chain graphs by the number of chain components and vertices, and labelled essential DAGs by the number of vertices. The latter count is a lower bound for the number of labelled essential graphs. The formula for labelled chain graphs can be extended in such a way that it also counts digraphs with two additional properties that essential graphs have.
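For very small vertex counts these notions can be made concrete by brute force; the following is a minimal sketch (not the paper's recurrences) that enumerates labelled DAGs and counts the essential ones via the Verma-Pearl characterisation of Markov equivalence (equal skeletons and equal v-structures):

```python
from collections import Counter
from itertools import combinations, product

def is_acyclic(n, edges):
    """Kahn's algorithm: repeatedly strip nodes with no incoming edges."""
    indeg = {v: 0 for v in range(n)}
    out = {v: [] for v in range(n)}
    for u, v in edges:
        indeg[v] += 1
        out[u].append(v)
    stack = [v for v in range(n) if indeg[v] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in out[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return seen == n

def all_dags(n):
    """Every labelled DAG on n nodes: each unordered pair is absent or
    oriented one of two ways; cyclic orientations are filtered out."""
    pairs = list(combinations(range(n), 2))
    for states in product((0, 1, 2), repeat=len(pairs)):
        edges = {(u, v) if s == 1 else (v, u)
                 for (u, v), s in zip(pairs, states) if s}
        if is_acyclic(n, edges):
            yield frozenset(edges)

def markov_key(edges):
    """Skeleton plus v-structures; two DAGs are Markov equivalent
    iff these coincide (Verma-Pearl)."""
    skel = frozenset(frozenset(e) for e in edges)
    vstructs = frozenset(
        (tuple(sorted((a, c))), b)
        for a, b in edges for c, d in edges
        if b == d and a != c and frozenset((a, c)) not in skel
    )
    return skel, vstructs

def count_essential_dags(n):
    """A DAG is essential iff it is alone in its Markov equivalence class."""
    class_sizes = Counter(markov_key(d) for d in all_dags(n))
    return sum(1 for size in class_sizes.values() if size == 1)
```

On 3 vertices, for instance, the only essential DAGs are the empty graph and the three labelled v-structures, so `count_essential_dags(3)` returns 4.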
Uniform random generation of large acyclic digraphs
Directed acyclic graphs are the basic representation of the structure
underlying Bayesian networks, which represent multivariate probability
distributions. In many practical applications, such as the reverse engineering
of gene regulatory networks, not only the estimation of model parameters but
the reconstruction of the structure itself is of great interest. As well as for
the assessment of different structure learning algorithms in simulation
studies, a uniform sample from the space of directed acyclic graphs is required
to evaluate the prevalence of certain structural features. Here we analyse how
to sample acyclic digraphs uniformly at random through recursive enumeration,
an approach previously thought too computationally involved. Based on
complexity considerations, we discuss in particular how the enumeration
directly provides an exact method, which avoids the convergence issues of the
alternative Markov chain methods and is actually computationally much faster.
The limiting behaviour of the distribution of acyclic digraphs then allows us
to sample arbitrarily large graphs. Building on the ideas of recursive
enumeration based sampling we also introduce a novel hybrid Markov chain with
much faster convergence than current alternatives while still being easy to
adapt to various restrictions. Finally we discuss how to include such
restrictions in the combinatorial enumeration and the new hybrid Markov chain
method for efficient uniform sampling of the corresponding graphs.Comment: 15 pages, 2 figures. To appear in Statistics and Computin
All solution graphs in multidimensional screening
We study general discrete-type multidimensional screening without noticeable restrictions on valuations, using instead an epsilon-relaxation of the incentive-compatibility constraints. Any active (binding) constraint can be perceived as an "envy" arc from one type to another, so the set of active constraints forms a digraph. We find that: (1) any solution has an in-rooted acyclic graph (a "river"); (2) for any logically feasible river there exists a screening problem resulting in that river. Using these results, any solution is characterized both through its spanning tree and through its Lagrange multipliers, which can help in finding solutions and their efficiency/distortion properties.
Keywords: incentive compatibility; multidimensional screening; second-degree price discrimination; non-linear pricing; graphs
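The "river" property can be checked mechanically: the digraph of active constraints must be acyclic and every node must have a directed path to a single root. A small illustrative sketch, with a hypothetical node/edge-list representation that is not taken from the paper:

```python
def is_river(nodes, edges):
    """True iff the digraph is acyclic and in-rooted: it has exactly one
    sink (the root) and every node has a directed path to it."""
    out = {v: [] for v in nodes}
    rev = {v: [] for v in nodes}
    for u, v in edges:
        out[u].append(v)
        rev[v].append(u)

    # Acyclicity: Kahn's algorithm, stripping nodes without incoming edges.
    indeg = {v: len(rev[v]) for v in nodes}
    stack = [v for v in nodes if indeg[v] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in out[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    if seen != len(nodes):
        return False

    # A single sink, reachable from everywhere (search along reversed edges).
    sinks = [v for v in nodes if not out[v]]
    if len(sinks) != 1:
        return False
    reached, frontier = {sinks[0]}, [sinks[0]]
    while frontier:
        for u in rev[frontier.pop()]:
            if u not in reached:
                reached.add(u)
                frontier.append(u)
    return len(reached) == len(nodes)
```

For example, two types both envying a third form a river, while a cycle of envy arcs or two disconnected sinks do not.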
Partition MCMC for inference on acyclic digraphs
Acyclic digraphs are the underlying representation of Bayesian networks, a
widely used class of probabilistic graphical models. Learning the underlying
graph from data is a way of gaining insights about the structural properties of
a domain. Structure learning forms one of the inference challenges of
statistical graphical models.
MCMC methods that sample graphs from the posterior distribution given the
data, notably structure MCMC, are probably the only viable option for Bayesian
model averaging. Score modularity and restrictions on the number of parents of
each node allow the graphs to be grouped into larger collections, which can be
scored as a whole to improve the chain's convergence. Current examples of
algorithms taking advantage of such grouping are the biased order MCMC, which
acts on the alternative space of permuted triangular matrices, and non-ergodic
edge-reversal moves.
Here we propose a novel algorithm which employs the underlying combinatorial
structure of DAGs to define a new grouping. As a result, convergence is
improved compared to structure MCMC, while the property of producing an
unbiased sample is retained. Finally, the method can be combined with
edge-reversal moves to improve the sampler further.
Comment: Revised version. 34 pages, 16 figures. R code available at
https://github.com/annlia/partitionMCM
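For context, the structure MCMC baseline mentioned above can be sketched as a Metropolis-Hastings chain over DAGs with single-edge moves. In this illustrative sketch a uniform target stands in for the posterior score, so the acceptance ratio reduces to a ratio of neighbourhood sizes; a real sampler would weight by the data score:

```python
import random
from itertools import permutations

def is_acyclic(n, edges):
    """Kahn's algorithm: repeatedly strip nodes with no incoming edges."""
    indeg = {v: 0 for v in range(n)}
    out = {v: [] for v in range(n)}
    for u, v in edges:
        indeg[v] += 1
        out[u].append(v)
    stack = [v for v in range(n) if indeg[v] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in out[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return seen == n

def neighbours(n, g):
    """DAGs one edge addition, deletion, or reversal away from g."""
    nbrs = []
    for u, v in permutations(range(n), 2):
        if (u, v) in g:
            nbrs.append(g - {(u, v)})                  # deletion
            rev = (g - {(u, v)}) | {(v, u)}
            if is_acyclic(n, rev):
                nbrs.append(rev)                       # reversal
        elif (v, u) not in g:
            add = g | {(u, v)}
            if is_acyclic(n, add):
                nbrs.append(add)                       # addition
    return nbrs

def structure_mcmc(n, steps, seed=0):
    """Metropolis-Hastings over DAGs with a uniform target: accept a
    proposed neighbour with probability |N(g)| / |N(g')|."""
    rng = random.Random(seed)
    g = frozenset()                                    # start at the empty DAG
    samples = []
    for _ in range(steps):
        proposal = frozenset(rng.choice(neighbours(n, g)))
        if rng.random() < len(neighbours(n, g)) / len(neighbours(n, proposal)):
            g = proposal
        samples.append(g)
    return samples
```

The single-edge neighbourhood is exactly what makes plain structure MCMC slow to mix, which is the motivation for the partition-based grouping proposed in the paper.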
Counting and Sampling from Markov Equivalent DAGs Using Clique Trees
A directed acyclic graph (DAG) is the most common graphical model for
representing causal relationships among a set of variables. When restricted to
using only observational data, the structure of the ground truth DAG is
identifiable only up to Markov equivalence, based on conditional independence
relations among the variables. Therefore, the number of DAGs equivalent to the
ground truth DAG is an indicator of the causal complexity of the underlying
structure--roughly speaking, it shows how many interventions or how much
additional information is further needed to recover the underlying DAG. In this
paper, we propose a new technique for counting the number of DAGs in a Markov
equivalence class. Our approach is based on the clique tree representation of
chordal graphs. We show that in the case of bounded degree graphs, the proposed
algorithm is polynomial time. We further demonstrate that this technique can be
utilized for uniform sampling from a Markov equivalence class, which provides a
stochastic way to enumerate DAGs in the equivalence class and may be needed for
finding the best DAG or for causal inference given the equivalence class as
input. We also extend our counting and sampling method to the case where prior
knowledge about the underlying DAG is available, and present applications of
this extension in causal experiment design and estimating the causal effect of
joint interventions.
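For small graphs the central quantity, the size of a Markov equivalence class, can be cross-checked by brute force using the Verma-Pearl characterisation (equal skeletons and equal v-structures). The following sketch is such a baseline check, not the paper's clique-tree algorithm:

```python
from itertools import product

def is_acyclic(n, edges):
    """Kahn's algorithm: repeatedly strip nodes with no incoming edges."""
    indeg = {v: 0 for v in range(n)}
    out = {v: [] for v in range(n)}
    for u, v in edges:
        indeg[v] += 1
        out[u].append(v)
    stack = [v for v in range(n) if indeg[v] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for v in out[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return seen == n

def vstructures(edges, skel):
    """Colliders a -> b <- c with a and c non-adjacent."""
    return {(tuple(sorted((a, c))), b)
            for a, b in edges for c, d in edges
            if b == d and a != c and frozenset((a, c)) not in skel}

def markov_equivalence_class(n, dag):
    """All DAGs with the same skeleton and the same v-structures as `dag`,
    found by trying every orientation of the skeleton."""
    skel = {frozenset(e) for e in dag}
    pairs = sorted(tuple(sorted(e)) for e in skel)
    target = vstructures(dag, skel)
    cls = []
    for bits in product((0, 1), repeat=len(pairs)):
        cand = {(u, v) if b == 0 else (v, u)
                for (u, v), b in zip(pairs, bits)}
        if is_acyclic(n, cand) and vstructures(cand, skel) == target:
            cls.append(frozenset(cand))
    return cls
```

For example, the chain 0 -> 1 -> 2 has a class of size 3, while the collider 0 -> 1 <- 2 is alone in its class; enumerating the class also gives a trivial (exponential-time) uniform sampler, which is precisely what the clique-tree method makes efficient.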