Operationalizing Complex Causes: A Pragmatic View of Mediation
We examine the problem of causal response estimation for complex objects (e.g., text, images, genomics). In this setting, classical \emph{atomic} interventions are often not available (e.g., changes to characters, pixels, DNA base-pairs). Instead, we only have access to indirect or \emph{crude} interventions (e.g., enrolling in a writing program, modifying a scene, applying a gene therapy). In this work, we formalize this problem and provide an initial solution. Given a collection of candidate mediators, we propose (a) a two-step method for predicting the causal responses of crude interventions; and (b) a testing procedure to identify mediators of crude interventions. We demonstrate, on a range of simulated and real-world-inspired examples, that our approach allows us to efficiently estimate the effect of crude interventions with limited data from new treatment regimes.
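The two-step idea can be illustrated with a linear toy model (all variable names and functional forms below are hypothetical, not the paper's actual estimator): first model how the mediator responds to the crude intervention, then model how the outcome responds to the mediator, and compose the two.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a binary crude intervention T (e.g. enrolling in a program)
# shifts a mediator M, which in turn drives the outcome Y.
n = 2000
T = rng.integers(0, 2, n)               # crude intervention indicator
M = 2.0 * T + rng.normal(0, 1, n)       # mediator responds to the intervention
Y = 3.0 * M + rng.normal(0, 1, n)       # outcome depends only on the mediator

# Step (a): model the mediator's response to the crude intervention.
mu_M = np.array([M[T == t].mean() for t in (0, 1)])

# Step (b): model the outcome as a function of the mediator (OLS on M).
X = np.column_stack([np.ones(n), M])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Compose the two steps: predicted causal response of the crude intervention.
effect = beta[1] * (mu_M[1] - mu_M[0])   # close to 2.0 * 3.0 = 6.0
```

In this linear setting the composition reduces to a product of coefficients; the appeal of the two-step decomposition is that step (b) can be learned from observational data alone, so only the (cheaper) mediator model needs refitting under a new treatment regime.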
Differentiable causal backdoor discovery
Discovering the causal effect of a decision is critical to nearly all forms of decision-making. In particular, it is a key quantity in drug development, in crafting government policy, and when implementing a real-world machine learning system. Given only observational data, confounders often obscure the true causal effect. Luckily, in some cases, it is possible to recover the causal effect by using certain observed variables to adjust for the effects of confounders. However, without access to the true causal model, finding this adjustment requires brute-force search. In this work, we present an algorithm that exploits auxiliary variables, similar to instruments, in order to find an appropriate adjustment by a gradient-based optimization method. We demonstrate that it outperforms practical alternatives in estimating the true causal effect, without knowledge of the full causal graph.
Comment: Published in the Proceedings of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS) 2020, Palermo, Italy.
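To see why finding the adjustment matters, here is a minimal numerical illustration of backdoor adjustment itself (not the paper's gradient-based discovery procedure; the data-generating process is invented for this sketch): regressing the outcome on the treatment alone is biased by a confounder, while including the right covariate in the regression recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
C = rng.normal(0, 1, n)                       # observed confounder
X = C + rng.normal(0, 1, n)                   # treatment, confounded by C
Y = 2.0 * X + 3.0 * C + rng.normal(0, 1, n)   # true causal effect of X is 2.0

# Naive regression of Y on X alone: biased by the open backdoor path via C.
naive = np.polyfit(X, Y, 1)[0]

# Backdoor adjustment: include C as a covariate in the regression.
Z = np.column_stack([np.ones(n), X, C])
beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
adjusted = beta[1]                            # close to the true effect 2.0
```

When the causal graph is unknown, the hard part is choosing *which* observed variables play the role of `C`; the paper's contribution is to turn that combinatorial search into a gradient-based optimization guided by instrument-like auxiliary variables.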
Causal Inference with Treatment Measurement Error: A Nonparametric Instrumental Variable Approach
We propose a kernel-based nonparametric estimator for the causal effect when the cause is corrupted by error. We do so by generalizing estimation in the instrumental variable setting. Despite significant work on regression with measurement error, additionally handling unobserved confounding in the continuous setting is non-trivial and has received little prior attention. As a by-product of our investigation, we clarify a connection between mean embeddings and characteristic functions, and show how learning one simultaneously allows one to learn the other. This opens the way for kernel method research to leverage existing results in characteristic function estimation. Finally, we empirically show that our proposed method, MEKIV, improves over baselines and is robust to changes in both the strength of the measurement error and the type of error distribution.
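MEKIV itself is a kernel-based nonparametric method, but the core intuition already shows up in the classical linear case: a valid instrument corrects for both unobserved confounding and attenuation from treatment measurement error. A minimal two-stage least squares sketch (with an invented data-generating process) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
Z = rng.normal(0, 1, n)                       # instrument
U = rng.normal(0, 1, n)                       # unobserved confounder
X_true = Z + U + rng.normal(0, 1, n)          # true (unobserved) cause
X_obs = X_true + rng.normal(0, 1, n)          # cause corrupted by measurement error
Y = 2.0 * X_true + U + rng.normal(0, 1, n)    # true causal effect is 2.0

# Naive regression on the mismeasured cause: biased by both
# confounding and attenuation from the measurement error.
naive = np.polyfit(X_obs, Y, 1)[0]

# Two-stage least squares: stage 1 projects the noisy cause onto the
# instrument; stage 2 regresses the outcome on that projection.
stage1 = np.polyfit(Z, X_obs, 1)
X_hat = np.polyval(stage1, Z)
iv = np.polyfit(X_hat, Y, 1)[0]               # close to the true effect 2.0
```

MEKIV generalizes this recipe beyond the linear setting by replacing the two regression stages with kernel ridge regressions over feature embeddings, which is where the connection between mean embeddings and characteristic functions becomes useful.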