3 research outputs found

    Heavy-tailed max-linear structural equation models in networks with hidden nodes

    Full text link
    Recursive max-linear vectors provide models for the causal dependence between large values of observed random variables, since they are supported on directed acyclic graphs (DAGs). However, the standard assumption that all nodes of such a DAG are observed is often unrealistic. We provide necessary and sufficient conditions under which a partially observed vector from a regularly varying model can be represented as a recursive max-linear (sub-)model. Our method relies on regular variation and on the minimal representation of a recursive max-linear vector, in which the max-weighted paths of a DAG play an essential role. Results are based on a scaling technique and on causal dependence relations between pairs of nodes. In certain cases our method can also detect the presence of hidden confounders. Under a two-step thresholding procedure, we show consistency and asymptotic normality of the estimators. Finally, we study our method by simulation and apply it to nutrition intake data.
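
    To make the model class concrete, below is a minimal Python sketch of a recursive max-linear vector on a small DAG with Fréchet (hence regularly varying) innovations. The graph, the edge weights, and the name sample_recursive_max_linear are hypothetical choices made for illustration only, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 4-node DAG with edges 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3;
    # parents[j] maps node j to its parents i and edge weights c_{ji}.
    parents = {
        0: {},
        1: {0: 0.7},
        2: {0: 0.5},
        3: {1: 0.6, 2: 0.8},
    }

    def sample_recursive_max_linear(n, parents, alpha=2.0):
        """Draw n samples of X_j = max(max_{i in pa(j)} c_{ji} * X_i, Z_j),
        with independent alpha-Frechet innovations Z_j and the innovation
        weights c_{jj} set to 1 for simplicity."""
        d = len(parents)
        # Standard Frechet(alpha) draws: Z = (-log U)^(-1/alpha), U ~ Uniform(0, 1)
        Z = (-np.log(rng.uniform(size=(n, d)))) ** (-1.0 / alpha)
        X = np.empty((n, d))
        for j in range(d):  # node labels are assumed to follow a topological order
            X[:, j] = Z[:, j]
            for i, c in parents[j].items():
                X[:, j] = np.maximum(X[:, j], c * X[:, i])
        return X

    X = sample_recursive_max_linear(10_000, parents)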

    Estimating an extreme Bayesian network via scalings

    No full text
    A recursive max-linear vector models causal dependence between its components by expressing each node variable as a max-linear function of its parental node variables in a directed acyclic graph and an exogenous innovation. Motivated by extreme value theory, the innovations are assumed to have regularly varying distribution tails. We propose a scaling technique to determine a causal order of the node variables. All dependence parameters are then estimated from the estimated scalings. Furthermore, we prove asymptotic normality of the estimated scalings and dependence parameters, based on asymptotic normality of the empirical spectral measure. Finally, we apply our structure learning and estimation algorithm to financial data and food dietary interview data.
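
    As a rough illustration of the scaling idea, and not of the paper's actual estimator or structure-learning algorithm, the sketch below estimates Fréchet-type scalings of single components and of pairwise maxima from heavy-tailed data; the function names and the simple order-statistic estimator are assumptions made for this example.

    import numpy as np

    def frechet_scale(x, alpha=2.0, k=200):
        """Crude tail-based estimate of the alpha-Frechet scale sigma of a sample:
        for large u, P(X > u) is approximately (sigma / u)**alpha, so sigma is
        approximately u * p_hat**(1 / alpha) with u the k-th largest observation
        and p_hat = k / n the empirical exceedance probability."""
        x = np.sort(np.asarray(x, dtype=float))
        u = x[-k]                          # high threshold: k-th largest value
        p_hat = k / x.size                 # empirical exceedance probability
        return u * p_hat ** (1.0 / alpha)

    def pairwise_max_scalings(X, alpha=2.0, k=200):
        """Estimated scalings of all pairwise maxima max(X_i, X_j); the diagonal
        holds the componentwise scalings. Comparing such quantities across pairs
        is the kind of information a scaling-based causal-order search uses."""
        d = X.shape[1]
        S = np.empty((d, d))
        for i in range(d):
            for j in range(d):
                S[i, j] = frechet_scale(np.maximum(X[:, i], X[:, j]), alpha, k)
        return S

    On data simulated with the earlier sketch, pairwise_max_scalings(X) returns a matrix of estimated scalings; turning such quantities into a causal order and dependence-parameter estimates is the step the paper develops rigorously.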