Marginal AMP Chain Graphs
We present a new family of models that is based on graphs that may have
undirected, directed and bidirected edges. We name these new models marginal
AMP (MAMP) chain graphs because each of them is Markov equivalent to some AMP
chain graph under marginalization of some of its nodes. However, MAMP chain
graphs subsume not only AMP chain graphs but also multivariate regression
chain graphs. We describe global and pairwise Markov properties for MAMP chain
graphs and prove their equivalence for compositional graphoids. We also
characterize when two MAMP chain graphs are Markov equivalent.
For Gaussian probability distributions, we also show that every MAMP chain
graph is Markov equivalent to some directed acyclic graph with
deterministic nodes under marginalization and conditioning on some of its
nodes. This is important because it implies that the independence model
represented by a MAMP chain graph can be accounted for by some data generating
process that is partially observed and has selection bias. Finally, we modify
MAMP chain graphs so that they are closed under marginalization for Gaussian
probability distributions. This is a desirable feature because it guarantees
parsimonious models under marginalization.
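The three edge types of a MAMP chain graph can be kept in a small typed container. The following is an illustrative sketch only (the class and method names are my own, not from the paper): it stores undirected and bidirected edges as unordered pairs and directed edges as ordered pairs.

```python
class MixedGraph:
    """Toy container for a graph that may have undirected (a - b),
    directed (a -> b) and bidirected (a <-> b) edges."""

    def __init__(self, nodes):
        self.nodes = set(nodes)
        self.undirected = set()   # frozenset({a, b})
        self.directed = set()     # tuple (a, b) meaning a -> b
        self.bidirected = set()   # frozenset({a, b})

    def add_edge(self, a, b, kind):
        assert a in self.nodes and b in self.nodes and a != b
        if kind == "undirected":
            self.undirected.add(frozenset((a, b)))
        elif kind == "directed":
            self.directed.add((a, b))
        elif kind == "bidirected":
            self.bidirected.add(frozenset((a, b)))
        else:
            raise ValueError(kind)

    def neighbours(self, a):
        """Nodes joined to a by an edge of any of the three types."""
        out = {next(iter(e - {a})) for e in self.undirected if a in e}
        out |= {y for (x, y) in self.directed if x == a}
        out |= {x for (x, y) in self.directed if y == a}
        out |= {next(iter(e - {a})) for e in self.bidirected if a in e}
        return out

# A tiny example mixing all three edge types: a -> b - c <-> d.
g = MixedGraph("abcd")
g.add_edge("a", "b", "directed")
g.add_edge("b", "c", "undirected")
g.add_edge("c", "d", "bidirected")
```

The separation criterion and the Markov properties of the paper are defined on top of such a structure; this sketch covers only the bookkeeping of edge types.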
Unifying Markov Properties for Graphical Models
Several types of graphs with different conditional independence
interpretations --- also known as Markov properties --- have been proposed and
used in graphical models. In this paper we unify these Markov properties by
introducing a class of graphs with four types of edges --- lines, arrows, arcs,
and dotted lines --- and a single separation criterion. We show that
independence structures defined by this class specialize to each of the
previously defined cases, when suitable subclasses of graphs are considered. In
addition, we define a pairwise Markov property for the subclass of chain mixed
graphs which includes chain graphs with the LWF interpretation, as well as
summary graphs (and consequently ancestral graphs). We prove the equivalence of
this pairwise Markov property to the global Markov property for compositional
graphoid independence models.
Comment: 31 pages, 6 figures, 1 table
Discrete chain graph models
The statistical literature discusses different types of Markov properties for
chain graphs that lead to four possible classes of chain graph Markov models.
The different models are rather well understood when the observations are
continuous and multivariate normal, and it is also known that one model class,
referred to as models of LWF (Lauritzen--Wermuth--Frydenberg) or block
concentration type, yields discrete models for categorical data that are
smooth. This paper considers the structural properties of the discrete models
based on the three alternative Markov properties. It is shown by example that
two of the alternative Markov properties can lead to non-smooth models. The
remaining model class, which can be viewed as a discrete version of
multivariate regressions, is proven to comprise only smooth models. The proof
employs a simple change of coordinates that also reveals that the model's
likelihood function is unimodal if the chain components of the graph are
complete sets.
Comment: Published in Bernoulli (http://isi.cbs.nl/bernoulli/), DOI:
http://dx.doi.org/10.3150/08-BEJ172, by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
Sequences of regressions and their independences
Ordered sequences of univariate or multivariate regressions provide
statistical models for analysing data from randomized, possibly sequential
interventions, from cohort or multi-wave panel studies, but also from
cross-sectional or retrospective studies. Conditional independences are
captured by what we name regression graphs, provided the generated distribution
shares some properties with a joint Gaussian distribution. Regression graphs
extend purely directed, acyclic graphs by two types of undirected graph, one
type for components of joint responses and the other for components of the
context vector variable. We review the special features and the history of
regression graphs, derive criteria to read all implied independences of a
regression graph, and prove criteria for Markov equivalence, that is, for
judging whether two different graphs imply the same set of independence
statements.
Knowledge of Markov equivalence provides alternative interpretations of a given
sequence of regressions, is essential for machine learning strategies and
permits the use of the simple graphical criteria of regression graphs on graphs for
which the corresponding criteria are in general more complex. Under the known
conditions under which a Markov equivalent directed acyclic graph exists for any
given regression graph, we give a polynomial time algorithm to find one such graph.
Comment: 43 pages, 17 figures. The manuscript is to appear as an invited
discussion paper in the journal TES
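For the purely directed, acyclic special case, implied independences can be read off with the classical moralization criterion (Lauritzen): X is independent of Y given S iff S separates X from Y in the moral graph of the subgraph induced by the ancestors of X, Y and S. The sketch below assumes a DAG given as a parent map; regression graphs need the extended criteria of the paper, and the function name here is illustrative.

```python
from collections import deque

def d_separated(dag, X, Y, S):
    """dag: dict node -> set of parents.  Test whether X is
    d-separated from Y given S, via the moralization criterion."""
    X, Y, S = set(X), set(Y), set(S)
    # 1. Ancestral set of X | Y | S (the nodes plus all their ancestors).
    anc, stack = set(), list(X | Y | S)
    while stack:
        v = stack.pop()
        if v not in anc:
            anc.add(v)
            stack.extend(dag[v])
    # 2. Moralize the induced subgraph: drop directions and
    #    "marry" every pair of parents sharing a child.
    adj = {v: set() for v in anc}
    for v in anc:
        parents = dag[v] & anc
        for p in parents:
            adj[v].add(p)
            adj[p].add(v)
        for p in parents:
            for q in parents:
                if p != q:
                    adj[p].add(q)
    # 3. X and Y are separated iff no path connects them avoiding S.
    seen, queue = set(X), deque(X)
    while queue:
        v = queue.popleft()
        for w in adj[v] - S:
            if w not in seen:
                if w in Y:
                    return False
                seen.add(w)
                queue.append(w)
    return True

# Collider a -> c <- b: a and b are marginally independent,
# but become dependent once c is conditioned on.
dag = {"a": set(), "b": set(), "c": {"a", "b"}}
```

The collider example shows why step 2 matters: conditioning on c pulls c into the ancestral set, and moralization then joins its co-parents a and b.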
Independencies Induced from a Graphical Markov Model After Marginalization and Conditioning: The R Package ggm
We describe some functions in the R package ggm to derive from a given Markov model, represented by a directed acyclic graph, different types of graphs induced after marginalizing over and conditioning on some of the variables. The package has a few basic functions that find the essential graph, the induced concentration and covariance graphs, and several types of chain graphs implied by the directed acyclic graph (DAG) after grouping and reordering the variables. These functions can be useful to explore the impact of latent variables or of selection effects on a chosen data generating model.
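One of the induced graphs mentioned above is easy to state concretely: when all variables of a DAG are observed, the induced concentration graph is the moral graph of the DAG. ggm itself is an R package; the following Python analogue is an illustrative sketch, not its API.

```python
def moral_graph(dag):
    """dag: dict node -> set of parents.  Return the undirected moral
    graph as a set of frozenset edges: the skeleton of the DAG plus
    'marriage' edges between parents that share a child."""
    edges = set()
    for child, parents in dag.items():
        for p in parents:
            edges.add(frozenset((p, child)))   # skeleton edge
        for p in parents:
            for q in parents:
                if p != q:
                    edges.add(frozenset((p, q)))  # marry co-parents
    return edges

# v-structure a -> c <- b: moralization adds the extra edge a - b.
dag = {"a": set(), "b": set(), "c": {"a", "b"}}
```

Marginalizing over or conditioning on variables leads to the richer induced graphs (covariance graphs, chain graphs) that the package computes; this sketch covers only the fully observed case.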