    On conditional probabilities and their canonical extensions to Boolean algebras of compound conditionals

    In this paper we investigate canonical extensions of conditional probabilities to Boolean algebras of conditionals. Before entering into the probabilistic setting, we first prove that the lattice order relation of every Boolean algebra of conditionals can be characterized in terms of the well-known order relation given by Goodman and Nguyen. Then, as a methodological tool, we show that canonical extensions behave well with respect to conditional subalgebras. As a consequence, we prove that a canonical extension and its original conditional probability agree on basic conditionals. Moreover, we verify that the probabilities of conjunctions and disjunctions of conditionals in a recently introduced framework of Boolean algebras of conditionals are in full agreement with the corresponding operations on conditionals as defined in the approach, developed by two of the authors, that treats conditionals as three-valued objects with a betting-based semantics, specified as suitable random quantities. Finally, we discuss relations of our approach with nonmonotonic reasoning based on an entailment relation among conditionals.
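
    For reference, the Goodman–Nguyen order relation mentioned above is commonly stated as follows (recalled here from the literature as an assumption; the paper's exact formulation may differ):

```latex
% Goodman-Nguyen order between basic conditionals (a|b) and (c|d),
% where a, b, c, d are elements of a Boolean algebra and b, d are nonzero:
(a \mid b) \;\sqsubseteq_{GN}\; (c \mid d)
  \quad\Longleftrightarrow\quad
  a \wedge b \,\leq\, c \wedge d
  \;\text{ and }\;
  \neg c \wedge d \,\leq\, \neg a \wedge b
```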

    Locally Adaptive Function Estimation for Binary Regression Models

    In this paper we present a nonparametric Bayesian approach for fitting unsmooth or highly oscillating functions in regression models with binary responses. The approach extends previous work by Lang et al. (2002) for Gaussian responses. Nonlinear functions are modelled by first or second order random walk priors with locally varying variances or smoothing parameters. Estimation is fully Bayesian and uses latent utility representations of binary regression models for efficient block sampling from the full conditionals of the nonlinear functions.
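
    As a rough illustration of the latent-utility idea for a probit link, the following sketch performs one Gibbs iteration combining Albert–Chib-style truncated-normal utilities with a second-order random-walk prior. It is not the authors' sampler: the function names are mine, the smoothing variance tau2 is held fixed rather than locally varying, and the covariate values are assumed sorted and equally spaced.

```python
# Minimal sketch, not the authors' sampler: one Gibbs iteration combining the
# Albert-Chib latent-utility representation of probit regression with a
# second-order random-walk (RW2) prior on the function values f.
import numpy as np
from scipy.stats import truncnorm

def rw2_precision(m, tau2):
    """RW2 prior precision: penalizes squared second differences of f."""
    D = np.diff(np.eye(m), n=2, axis=0)      # (m-2) x m second-difference matrix
    return D.T @ D / tau2

def gibbs_step(y, f, tau2):
    """One block update; y in {0,1}, f = current function values (length m)."""
    m = len(y)
    # 1) Draw latent utilities z_i ~ N(f_i, 1), truncated to the half-line
    #    consistent with the observed response y_i.
    lower = np.where(y == 1, -f, -np.inf)    # standardized bounds for z - f
    upper = np.where(y == 1, np.inf, -f)
    z = f + truncnorm.rvs(lower, upper)
    # 2) Block-sample f from its Gaussian full conditional:
    #    precision Q = K + I, mean mu = Q^{-1} z.
    Q = rw2_precision(m, tau2) + np.eye(m)
    L = np.linalg.cholesky(Q)
    mu = np.linalg.solve(Q, z)
    return mu + np.linalg.solve(L.T, np.random.randn(m))
```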

    Inferring Causal Directions by Evaluating the Complexity of Conditional Distributions

    We propose a new approach to infer the causal structure that has generated the observed statistical dependences among n random variables. The idea is that the factorization of the joint measure of cause and effect into P(cause)P(effect|cause) typically leads to simpler conditionals than non-causal factorizations. To evaluate the complexity of the conditionals we have tried two methods. First, we have compared them to those which maximize the conditional entropy subject to the observed first and second moments, since we consider the latter to be the simplest conditionals. Second, we have fitted the data with conditional probability measures that are exponentials of functions in a reproducing kernel Hilbert space (RKHS) and defined the complexity by a Hilbert-space semi-norm. Such a complexity measure has several properties that are useful for our purpose. We describe some encouraging results with both methods applied to real-world data. Moreover, we have combined constraint-based approaches to causal discovery (i.e., methods using only information on conditional statistical dependences) with our method in order to distinguish between causal hypotheses that are equivalent with respect to the imposed independences. Furthermore, we compare the performance to that of Bayesian approaches to causal inference.
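
    The first complexity criterion can be illustrated with a deliberately crude sketch (my reading, not the paper's estimator): for real-valued variables, the conditional maximizing entropy subject to fixed first and second moments is linear-Gaussian, so a direction whose conditional is well captured by a linear-Gaussian fit counts as "simple". The function names and the scoring rule below are illustrative assumptions.

```python
# Illustrative sketch (assumptions mine, not the paper's estimator): score each
# causal direction by how well the conditional of the putative effect given the
# putative cause is captured by the maximum-entropy family for fixed first and
# second moments, i.e. a Gaussian with linear mean and constant variance.
import numpy as np
from scipy.stats import norm

def gaussian_linear_loglik(cause, effect):
    """Mean log-likelihood of effect | cause under a linear-Gaussian fit."""
    X = np.column_stack([np.ones_like(cause), cause])
    beta, *_ = np.linalg.lstsq(X, effect, rcond=None)
    resid = effect - X @ beta
    return norm.logpdf(resid, scale=resid.std()).mean()

def prefer_direction(x, y):
    """Prefer the direction whose conditional looks 'simpler' in this sense."""
    return "X -> Y" if gaussian_linear_loglik(x, y) > gaussian_linear_loglik(y, x) else "Y -> X"
```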

    Causal inference using the algorithmic Markov condition

    Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov-equivalent causal graphs. This insight provides a theoretical foundation for a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests that are based on implicit or explicit assumptions on the underlying distribution.
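
    One common computable stand-in for algorithmic mutual information replaces Kolmogorov complexity with compressed length; it is a standard compression heuristic, offered here as an assumption rather than the decidable criterion proposed in the paper.

```python
# Crude, computable stand-in for algorithmic mutual information, using
# compressed length in place of Kolmogorov complexity:
#   I(x : y) ~ C(x) + C(y) - C(x, y)
import zlib

def c(blob: bytes) -> int:
    """Compressed length as a computable surrogate for Kolmogorov complexity."""
    return len(zlib.compress(blob, 9))

def approx_mutual_information(x: bytes, y: bytes) -> int:
    return c(x) + c(y) - c(x + y)
```

    Values close to zero would then play the role of approximate algorithmic independence in inference rules of the kind described above.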

    Distinguishing Cause and Effect via Second Order Exponential Models

    We propose a method to infer causal structures containing both discrete and continuous variables. The idea is to select causal hypotheses for which the conditional density of every variable, given its causes, becomes smooth. We define a family of smooth densities and conditional densities by second order exponential models, i.e., by maximizing conditional entropy subject to first and second statistical moments. If some of the variables take values only in proper subsets of R^n, these conditionals can induce different families of joint distributions even for Markov-equivalent graphs. We consider the case of one binary and one real-valued variable, where the method can distinguish between cause and effect. Using this example, we show that sometimes a causal hypothesis must be rejected because P(effect|cause) and P(cause) share algorithmic information (which is untypical if they are chosen independently). In this way, our method is in the same spirit as faithfulness-based causal inference, because it also rejects non-generic mutual adjustments among DAG parameters.
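
    A toy sketch of the binary/real case (assumptions mine; not the paper's fitting procedure): under the hypothesis B -> Y the second-order exponential conditional P(Y|B) is a Gaussian per class, while under Y -> B it is a logistic model in (y, y^2), so the two Markov-equivalent graphs induce different joint families whose fitted log-likelihoods can be compared.

```python
# Toy sketch (assumptions mine, not the paper's procedure).
# Hypothesis B -> Y: P(B) empirical, P(Y|B) a Gaussian per class.
# Hypothesis Y -> B: P(Y) a single Gaussian, P(B|Y) logistic in (y, y^2).
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import LogisticRegression

def loglik_b_to_y(b, y):
    """Joint log-likelihood under B -> Y."""
    b = np.asarray(b, dtype=int)
    p1 = b.mean()
    ll = np.sum(b * np.log(p1) + (1 - b) * np.log(1 - p1))
    for k in (0, 1):
        yk = y[b == k]
        ll += norm.logpdf(yk, loc=yk.mean(), scale=yk.std()).sum()
    return ll

def loglik_y_to_b(b, y):
    """Joint log-likelihood under Y -> B."""
    b = np.asarray(b, dtype=int)
    ll = norm.logpdf(y, loc=y.mean(), scale=y.std()).sum()
    feats = np.column_stack([y, y ** 2])
    clf = LogisticRegression().fit(feats, b)
    probs = clf.predict_proba(feats)[np.arange(len(b)), b]
    return ll + np.log(probs).sum()
```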