
    Counting and Sampling from Markov Equivalent DAGs Using Clique Trees

    A directed acyclic graph (DAG) is the most common graphical model for representing causal relationships among a set of variables. When restricted to using only observational data, the structure of the ground-truth DAG is identifiable only up to Markov equivalence, based on conditional independence relations among the variables. The number of DAGs equivalent to the ground-truth DAG is therefore an indicator of the causal complexity of the underlying structure: roughly speaking, it shows how many interventions, or how much additional information, would be needed to recover the underlying DAG. In this paper, we propose a new technique for counting the number of DAGs in a Markov equivalence class. Our approach is based on the clique tree representation of chordal graphs. We show that for graphs of bounded degree the proposed algorithm runs in polynomial time. We further demonstrate that this technique can be used for uniform sampling from a Markov equivalence class, which provides a stochastic way to enumerate the DAGs in the class and may be needed for finding the best DAG or for causal inference when the equivalence class is given as input. We also extend our counting and sampling method to the case where prior knowledge about the underlying DAG is available, and present applications of this extension to causal experiment design and to estimating the causal effect of joint interventions.
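
    The counting problem above also has a simple brute-force baseline that is useful for intuition (exponential, unlike the clique-tree method proposed in the paper): by the classical characterization, two DAGs are Markov equivalent exactly when they have the same skeleton and the same v-structures. The hypothetical Python sketch below counts an equivalence class by enumerating all acyclic orientations of the skeleton and keeping those whose v-structures match.

        from itertools import product

        def parents(nodes, edges):
            """Map each node to its set of parents in the directed edge set."""
            return {v: {u for (u, w) in edges if w == v} for v in nodes}

        def v_structures(nodes, edges):
            """Unshielded colliders a -> c <- b with a and b non-adjacent."""
            adj = {frozenset(e) for e in edges}
            pa = parents(nodes, edges)
            return {(a, c, b)
                    for c in nodes
                    for a in pa[c] for b in pa[c]
                    if a < b and frozenset((a, b)) not in adj}

        def is_acyclic(nodes, edges):
            """Kahn-style check: repeatedly remove nodes with no remaining parents."""
            pa = parents(nodes, edges)
            remaining = set(nodes)
            while remaining:
                sources = [v for v in remaining if not (pa[v] & remaining)]
                if not sources:
                    return False
                remaining -= set(sources)
            return True

        def mec_size(nodes, edges):
            """Size of the Markov equivalence class of the DAG given by `edges`."""
            skeleton = sorted(tuple(sorted(e)) for e in {frozenset(e) for e in edges})
            target = v_structures(nodes, edges)
            count = 0
            for flips in product([False, True], repeat=len(skeleton)):
                dag = {(v, u) if flip else (u, v)
                       for (u, v), flip in zip(skeleton, flips)}
                if is_acyclic(nodes, dag) and v_structures(nodes, dag) == target:
                    count += 1
            return count

        print(mec_size("XYZ", {("X", "Y"), ("Y", "Z")}))  # chain X -> Y -> Z: prints 3
        print(mec_size("XYZ", {("X", "Y"), ("Z", "Y")}))  # collider X -> Y <- Z: prints 1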

    Sequences of regressions and their independences

    Ordered sequences of univariate or multivariate regressions provide statistical models for analysing data from randomized, possibly sequential interventions, from cohort or multi-wave panel studies, but also from cross-sectional or retrospective studies. Conditional independences are captured by what we name regression graphs, provided the generated distribution shares some properties with a joint Gaussian distribution. Regression graphs extend purely directed, acyclic graphs by two types of undirected graph: one for components of joint responses and the other for components of the context vector variable. We review the special features and the history of regression graphs, derive criteria for reading off all independences implied by a regression graph, and prove criteria for Markov equivalence, that is, for judging whether two different graphs imply the same set of independence statements. Knowledge of Markov equivalence provides alternative interpretations of a given sequence of regressions, is essential for machine-learning strategies, and makes it possible to apply the simple graphical criteria of regression graphs to graphs for which the corresponding criteria are in general more complex. When the known conditions hold under which a Markov equivalent directed acyclic graph exists for a given regression graph, we give a polynomial-time algorithm to find one such graph. (43 pages with 17 figures; to appear as an invited discussion paper in the journal TEST.)

    Graphical Modeling for High Dimensional Data

    With advances in science and information technology, many scientific fields are now able to meet the challenges of managing and analyzing high-dimensional data. A so-called large p, small n problem arises when the number of experimental units, n, is equal to or smaller than the number of features, p. A methodology based on probability and graph theory, termed graphical models, is applied to study the structure of such high-dimensional data and to carry out inference on it.
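
    As a concrete illustration of graphical modeling in the large p, small n regime (a standard technique, not necessarily the methodology developed in this work): the graphical lasso estimates a sparse precision matrix whose nonzero off-diagonal entries define the edges of a Gaussian graphical model. A minimal Python sketch using scikit-learn follows; the data and dimensions are made up for the example.

        import numpy as np
        from sklearn.covariance import GraphicalLassoCV

        rng = np.random.default_rng(0)
        n, p = 25, 40                          # fewer samples than features
        X = rng.standard_normal((n, p))
        X[:, 1] += 0.8 * X[:, 0]               # plant one dependency between features 0 and 1

        model = GraphicalLassoCV().fit(X)      # sparsity level chosen by cross-validation
        K = model.precision_                   # estimated inverse covariance matrix

        # Nonzero off-diagonal entries of K are the edges of the estimated
        # Gaussian graphical model (conditional dependence given all others).
        edges = [(i, j) for i in range(p) for j in range(i + 1, p)
                 if abs(K[i, j]) > 1e-8]
        print(len(edges), "edges estimated; (0, 1) present:", (0, 1) in edges)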

    Marginal AMP Chain Graphs

    We present a new family of models based on graphs that may have undirected, directed and bidirected edges. We name these new models marginal AMP (MAMP) chain graphs because each of them is Markov equivalent to some AMP chain graph under marginalization of some of its nodes. However, MAMP chain graphs subsume not only AMP chain graphs but also multivariate regression chain graphs. We describe global and pairwise Markov properties for MAMP chain graphs and prove their equivalence for compositional graphoids. We also characterize when two MAMP chain graphs are Markov equivalent. For Gaussian probability distributions, we further show that every MAMP chain graph is Markov equivalent to some directed acyclic graph with deterministic nodes under marginalization and conditioning on some of its nodes. This is important because it implies that the independence model represented by a MAMP chain graph can be accounted for by some data-generating process that is partially observed and subject to selection bias. Finally, we modify MAMP chain graphs so that they are closed under marginalization for Gaussian probability distributions. This is a desirable feature because it guarantees parsimonious models under marginalization.

    Algebraic Methods of Classifying Directed Graphical Models

    Directed acyclic graphical models (DAGs) are often used to describe common structural properties in a family of probability distributions. This paper addresses the question of classifying DAGs up to isomorphism. When Gaussian densities are considered, the question reduces to verifying equality of certain algebraic varieties. The question of computing equations for these varieties has been raised previously in the literature. Here it is shown that the most natural method adds spurious components with singular principal minors, proving a conjecture of Sullivant. This characterization is used to establish an algebraic criterion for isomorphism and to provide a randomized algorithm for checking that criterion. The results are applied to produce a list of the isomorphism classes of tree models on 4, 5, and 6 nodes. Finally, some evidence is provided that projectivized DAG varieties contain useful information, in the sense that their relative embedding is closely related to efficient inference.
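
    For readers unfamiliar with the varieties involved: a linear Gaussian DAG model parametrizes its covariance matrix polynomially in the edge weights, Sigma = (I - Lambda)^{-T} Omega (I - Lambda)^{-1}, and the closure of the image of this map is the DAG variety whose defining equations the paper discusses. The sympy sketch below (standard textbook material on a hypothetical three-node DAG, not the paper's algorithm) prints these polynomial entries.

        import sympy as sp

        # Hypothetical 3-node DAG: 1 -> 2 -> 3 plus the extra edge 1 -> 3.
        l12, l13, l23 = sp.symbols("l12 l13 l23")            # edge weights
        w1, w2, w3 = sp.symbols("w1 w2 w3", positive=True)   # error variances

        Lam = sp.Matrix([[0, l12, l13],
                         [0, 0,   l23],
                         [0, 0,   0  ]])   # Lam[i, j] is the weight of edge i -> j
        Omega = sp.diag(w1, w2, w3)
        I = sp.eye(3)

        B = (I - Lam).inv()                # X = B.T * errors under the structural equations
        Sigma = sp.simplify(B.T * Omega * B)
        sp.pprint(Sigma)                   # each entry is a polynomial in the weights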

    Markov properties for mixed graphs

    In this paper, we unify the Markov theory of a variety of different types of graphs used in graphical Markov models by introducing the class of loopless mixed graphs, and show that all independence models induced by m-separation on such graphs are compositional graphoids. We focus in particular on the subclass of ribbonless graphs, which includes as special cases undirected graphs, bidirected graphs, and directed acyclic graphs, as well as ancestral graphs and summary graphs. We define maximality of such graphs as well as a pairwise and a global Markov property. We prove that the global and pairwise Markov properties of a maximal ribbonless graph are equivalent for any independence model that is a compositional graphoid. (Published in Bernoulli (http://isi.cbs.nl/bernoulli/), http://dx.doi.org/10.3150/12-BEJ502, by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).)

    Independencies Induced from a Graphical Markov Model After Marginalization and Conditioning: The R Package ggm

    We describe some functions in the R package ggm to derive from a given Markov model, represented by a directed acyclic graph, different types of graphs induced after marginalizing over and conditioning on some of the variables. The package has a few basic functions that find the essential graph, the induced concentration and covariance graphs, and several types of chain graphs implied by the directed acyclic graph (DAG) after grouping and reordering the variables. These functions can be useful to explore the impact of latent variables or of selection effects on a chosen data generating model.
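
    The sketch below is not the ggm package itself (which is written in R); it is a small numpy illustration, under an assumed linear Gaussian model, of what two of the induced graphs encode: missing edges in the covariance graph correspond to zero marginal covariances, and missing edges in the concentration graph correspond to zeros in the precision matrix, i.e. independences given all remaining variables.

        import numpy as np

        p = 3
        Lam = np.zeros((p, p))                 # DAG 0 -> 1 -> 2 with the edge weights below
        Lam[0, 1] = 0.9
        Lam[1, 2] = 0.7
        B = np.linalg.inv(np.eye(p) - Lam)     # X = B.T @ eps, with unit error variances
        Sigma = B.T @ B                        # implied covariance matrix
        K = np.linalg.inv(Sigma)               # implied concentration (precision) matrix

        tol = 1e-10
        cov_edges = [(i, j) for i in range(p) for j in range(i + 1, p) if abs(Sigma[i, j]) > tol]
        con_edges = [(i, j) for i in range(p) for j in range(i + 1, p) if abs(K[i, j]) > tol]
        print("covariance graph edges:   ", cov_edges)  # all pairs: every pair is marginally dependent
        print("concentration graph edges:", con_edges)  # (0, 2) missing: 0 independent of 2 given 1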

    Properties, Learning Algorithms, and Applications of Chain Graphs and Bayesian Hypergraphs

    Probabilistic graphical models (PGMs) use graphs, either undirected, directed, or mixed, to represent possible dependencies among the variables of a multivariate probability distribution. PGMs, such as Bayesian networks and Markov networks, are now widely accepted as a powerful and mature framework for reasoning and decision making under uncertainty in knowledge-based systems. With the increase in their popularity, the range of graphical models being investigated and used has also expanded. Several types of graphs with different conditional independence interpretations, also known as Markov properties, have been proposed and used in graphical models.

    The graphical structure of a Bayesian network has the form of a directed acyclic graph (DAG), which has the advantage of supporting an interpretation of the graph in terms of cause-effect relationships. A limitation, however, is that only asymmetric relationships, such as cause-and-effect relationships, can be modeled between variables in a DAG. Chain graphs, which admit both directed and undirected edges, can be used to overcome this limitation. Today there exist three main interpretations of chain graphs in the literature: the Lauritzen-Wermuth-Frydenberg, the Andersson-Madigan-Perlman, and the multivariate regression interpretations. In this thesis, we study these interpretations based on their separation criteria and the intuition behind their edges. Since structure learning is a critical component in constructing an intelligent system based on a chain graph model, we propose new feasible and efficient structure learning algorithms to learn chain graphs from data under the faithfulness assumption.

    The proliferation of different PGMs that allow factorizations of different kinds leads us to consider a more general graphical structure in this thesis, namely directed acyclic hypergraphs. Directed acyclic hypergraphs are the graphical structure of a new probabilistic graphical model that we call Bayesian hypergraphs. Since there are many more hypergraphs than DAGs, undirected graphs, chain graphs, and, indeed, other graph-based networks, Bayesian hypergraphs can model much finer factorizations and thus are more computationally efficient. Bayesian hypergraphs also allow a modeler to represent causal patterns of interaction such as Noisy-OR graphically, without additional annotations. We introduce global, local and pairwise Markov properties of Bayesian hypergraphs and prove under which conditions they are equivalent. We also extend the causal interpretation of LWF chain graphs to Bayesian hypergraphs and provide corresponding formulas and a graphical criterion for intervention.

    The framework of graphical models, which provides algorithms for discovering and analyzing structure in complex distributions so as to describe them succinctly and extract unstructured information, allows such models to be constructed and utilized effectively. Two of the most important applications of graphical models are causal inference and information extraction. To address these abilities of graphical models, we conduct a causal analysis that compares the performance behavior of highly configurable systems across environmental conditions (changing workload, hardware, and software versions), to explore when and how causal knowledge can be exploited for performance analysis.
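
    The Noisy-OR pattern mentioned above has a standard closed form that can be stated independently of the hypergraph formalism: the effect fails to occur only if every active cause independently fails to trigger it and the leak does not fire. A small hedged Python sketch (textbook definition, not code from the thesis):

        def noisy_or(parent_states, activation_probs, leak=0.0):
            """P(effect = 1 | parents) under a Noisy-OR interaction.

            Each active parent i independently triggers the effect with
            probability activation_probs[i]; `leak` triggers it even when
            all parents are off.
            """
            fail = 1.0 - leak
            for on, p in zip(parent_states, activation_probs):
                if on:
                    fail *= 1.0 - p
            return 1.0 - fail

        # Two causes with activation probabilities 0.8 and 0.6 and a 5% leak.
        print(noisy_or([1, 0], [0.8, 0.6], leak=0.05))   # approx. 0.81
        print(noisy_or([1, 1], [0.8, 0.6], leak=0.05))   # approx. 0.924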