7 research outputs found

    How to Discover Composition with the PC Algorithm

    Some recent exchanges (Gebharter 2017a, 2017b; Baumgartner and Cassini 2023) concern whether composition can have conditional independence properties analogous to those of causal relations. If so, composition might sometimes be detectable by the application of causal search algorithms. The discussion has focused on one particular algorithm, PC (Spirtes and Glymour 1991). PC is but one, and in many circumstances not the best, of a host of causal search algorithms that are candidates for methods of discovering composition, provided appropriate statistical relations obtain. The discussion raises two issues: (1) does the structure of the composition relation entail probability relations such that PC and like algorithms cannot discover composition from frequency data about kinds; and (2) what can be discovered, and how, about the composition of systems by PC or related causal search algorithms that exploit conditional independence relations? Baumgartner and Cassini answer the first question positively: constitution entails probability relations incompatible with discovery by PC. They do not engage the second question, but we will
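    The conditional independence machinery that PC exploits can be illustrated with a minimal sketch. The helper functions and the chain structure below are invented for illustration (this is not composition discovery itself, just the Fisher-style partial-correlation test that PC-like algorithms use to decide whether to remove an edge):

```python
# Minimal sketch of the conditional-independence testing at the heart of
# PC-style search. All function names here are illustrative, not from
# any particular library.
import math
import random

def corr(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / math.sqrt(sxx * syy)

def partial_corr(xs, ys, zs):
    """Correlation of x and y after controlling for a single
    conditioning variable z (recursive formula, one conditioner)."""
    rxy, rxz, ryz = corr(xs, ys), corr(xs, zs), corr(ys, zs)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Illustrative chain X -> Z -> Y: X and Y are dependent marginally, but
# (approximately) independent conditional on Z, so a PC-style search
# would remove the X-Y edge.
random.seed(0)
n = 5000
x = [random.gauss(0, 1) for _ in range(n)]
z = [xi + random.gauss(0, 1) for xi in x]
y = [zi + random.gauss(0, 1) for zi in z]

r_marginal = corr(x, y)           # clearly nonzero
r_partial = partial_corr(x, y, z)  # near zero
```

    Whether composition, as opposed to causation, gives rise to independence patterns that such tests can pick up is exactly what the exchange disputes.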

    Determinism and the Method of Difference

    The first part of this paper reveals a conflict between the core principles of deterministic causation and the standard method of difference, which is widely seen (and used) as a correct method of causally analyzing deterministic structures. We show that applying the method of difference to deterministic structures can give rise to causal inferences that contradict the principles of deterministic causation. The second part then locates the source of this conflict in an inference rule implemented in the method of difference, according to which factors that can make a difference to investigated effects relative to one particular test setup are to be identified as causes, provided the causal background of the corresponding setup is homogeneous. The paper ends by modifying the method of difference in a way that renders it compatible with the principles of deterministic causation.
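    The inference rule at issue can be sketched schematically. The factor names, the deterministic structure, and the helper function below are invented for illustration; this is the classical difference-making rule the abstract describes, not the paper's modified method:

```python
# Sketch of the method of difference: two test runs that agree on every
# background factor and differ only in one candidate factor; if the
# effect differs between them, the candidate is inferred as a cause.
# Setup and factor names are hypothetical.

def method_of_difference(run_with, run_without, candidate):
    """Return True if `candidate` is inferred as a cause of the effect.
    Each run is a dict of factor -> bool plus an 'effect' entry; the
    runs must be homogeneous in every factor except `candidate`."""
    background = [f for f in run_with if f not in (candidate, "effect")]
    assert all(run_with[f] == run_without[f] for f in background), \
        "causal background not homogeneous"
    assert run_with[candidate] and not run_without[candidate]
    return run_with["effect"] != run_without["effect"]

# Illustrative deterministic structure: the effect occurs iff A and B.
structure = lambda a, b: a and b
run1 = {"A": True, "B": True, "effect": structure(True, True)}
run2 = {"A": False, "B": True, "effect": structure(False, True)}
inferred = method_of_difference(run1, run2, "A")  # A makes a difference
```

    The paper's point is that inferences licensed by this rule can clash with deterministic causation in more complex structures, which motivates the modification proposed at the end.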

    Causal Discovery in Linear Structural Causal Models with Deterministic Relations

    Linear structural causal models (SCMs) -- in which each observed variable is generated by a subset of the other observed variables as well as a subset of the exogenous sources -- are pervasive in causal inference and causal discovery. However, for the task of causal discovery, existing work focuses almost exclusively on the submodel where each observed variable is associated with a distinct source with non-zero variance. This results in the restriction that no observed variable can deterministically depend on other observed variables or latent confounders. In this paper, we extend the results on structure learning by focusing on a subclass of linear SCMs that does not have this restriction, i.e., models in which observed variables can be causally affected by any subset of the sources and are allowed to be a deterministic function of other observed variables or latent confounders. This allows for a more realistic modeling of influence or information propagation in systems. We focus on the task of causal discovery from observational data generated by a member of this subclass. We derive a set of necessary and sufficient conditions for unique identifiability of the causal structure. To the best of our knowledge, this is the first work that gives identifiability results for causal discovery under both latent confounding and deterministic relationships. Further, we propose an algorithm for recovering the underlying causal structure when the aforementioned conditions are satisfied. We validate our theoretical results on both synthetic and real datasets. (Comment: Accepted at the 1st Conference on Causal Learning and Reasoning, CLeaR 2022.)
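    The deterministic case the abstract describes can be sketched concretely. The coefficients and variable names below are arbitrary illustrative choices, not from the paper: one observed variable is an exact linear function of two others, so it carries no distinct exogenous source of its own:

```python
# Sketch of a linear SCM with a deterministic observed variable. In the
# standard submodel, every observed variable has its own noise source
# with non-zero variance; x3 below violates that assumption.
import random

random.seed(1)
n = 2000
s1 = [random.gauss(0, 1) for _ in range(n)]  # exogenous source for x1
s2 = [random.gauss(0, 1) for _ in range(n)]  # exogenous source for x2
x1 = s1
x2 = [0.5 * a + b for a, b in zip(x1, s2)]
x3 = [2.0 * a - 1.0 * b for a, b in zip(x1, x2)]  # deterministic: no own source

# The residual of x3 given its parents is exactly zero, which is what
# rules this model out of standard structure-learning results.
residual = [c - (2.0 * a - 1.0 * b) for a, b, c in zip(x1, x2, x3)]
max_abs_residual = max(abs(r) for r in residual)
```

    Because the residual variance of x3 vanishes, any method that assumes a distinct non-degenerate source per observed variable is inapplicable, which is the gap this paper's identifiability conditions address.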

    Simple low cost causal discovery using mutual information and domain knowledge

    This thesis examines causal discovery within datasets, in particular observational datasets where normal experimental manipulation is not possible. A number of machine learning techniques are examined in relation to their use of knowledge and the insights they can provide regarding the situation under study. Their use of prior knowledge and the causal knowledge produced by the learners are examined. Current causal learning algorithms are discussed in terms of their strengths and limitations. The main contribution of the thesis is a new causal learner, LUMIN, that operates with polynomial time complexity in both the number of variables and the number of records examined. It makes no prior assumptions about the form of the relationships and is capable of making extensive use of available domain information. This learner is compared to a number of current learning algorithms and is shown to be competitive with them.
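    Mutual information is a natural low-cost dependence score for this kind of discrete observational data. The sketch below is a generic illustration of that score, not LUMIN itself; the variables and the scoring function are invented for the example:

```python
# Mutual information I(X;Y) estimated from paired discrete samples,
# usable as a cheap, assumption-free dependence score between variables.
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in nats, computed from empirical frequencies."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

random.seed(3)
x = [i % 2 for i in range(1000)]
y = [1 - v for v in x]                          # deterministic function of x
w = [random.randrange(2) for _ in range(1000)]  # unrelated noise

strong = mutual_information(x, y)  # exactly H(X) = log 2 here
weak = mutual_information(x, w)    # near zero
```

    Ranking candidate relationships by such a score makes no assumption about the functional form of the dependence, which matches the design goal the abstract states; turning scores into causal directions is where domain knowledge comes in.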

    Learning Bayesian networks in Semi-deterministic systems

    In current constraint-based (Pearl-style) systems for discovering Bayesian networks, inputs with deterministic relations are prohibited. This restricts the applicability of these systems. In this paper, we formalize a sufficient condition under which Bayesian networks can be recovered even in the presence of deterministic relations. The sufficient condition leads to an improvement of Pearl’s IC algorithm; other constraint-based algorithms can be similarly improved. The new algorithm, assuming the proposed sufficient condition, is able to recover Bayesian networks with deterministic relations and, moreover, suffers no loss of performance when applied to nondeterministic Bayesian networks.
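    Why deterministic inputs break constraint-based search can be shown with a small sketch. The variables, noise level, and estimator below are invented for illustration: when Z is an exact copy of X, conditioning on either one screens off the other from Y, so an IC/PC-style learner would wrongly drop both the X–Y and the Z–Y edge:

```python
# Conditional mutual information I(X;Y|Z) from discrete samples, used
# to show the spurious conditional independencies that determinism
# induces. All names here are illustrative.
import math
import random
from collections import Counter

def cond_mutual_information(xs, ys, zs):
    """I(X;Y|Z) in nats, computed from empirical frequencies."""
    n = len(xs)
    pxyz = Counter(zip(xs, ys, zs))
    pxz, pyz, pz = Counter(zip(xs, zs)), Counter(zip(ys, zs)), Counter(zs)
    return sum((c / n) * math.log((c / n) * (pz[z] / n)
               / ((pxz[(a, z)] / n) * (pyz[(b, z)] / n)))
               for (a, b, z), c in pxyz.items())

random.seed(2)
n = 4000
x = [random.randrange(2) for _ in range(n)]
z = list(x)                                      # deterministic: Z = X
y = [xi ^ (random.random() < 0.2) for xi in x]   # noisy function of X

agreement = sum(a == b for a, b in zip(x, y)) / n  # ~0.8: X-Y dependent
i_xy_given_z = cond_mutual_information(x, y, z)    # 0: Z screens off X
i_zy_given_x = cond_mutual_information(z, y, x)    # 0: X screens off Z
```

    Both conditional mutual informations vanish even though Y genuinely depends on X, which is the pathology a sufficient condition for recovery must rule out or handle.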