
    Binary Models for Marginal Independence

    Log-linear models are a classical tool for the analysis of contingency tables. In particular, the subclass of graphical log-linear models provides a general framework for modelling conditional independences. However, with the exception of special structures, marginal independence hypotheses cannot be accommodated by these traditional models. Focusing on binary variables, we present a model class that provides a framework for modelling marginal independences in contingency tables. The approach taken is graphical and draws on analogies to multivariate Gaussian models for marginal independence. For the graphical model representation we use bi-directed graphs, which are in the tradition of path diagrams. We show how the models can be parameterized in a simple fashion, and how maximum likelihood estimation can be performed using a version of the Iterated Conditional Fitting algorithm. Finally, we consider combining these models with symmetry restrictions.
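    As a concrete illustration of the marginal independence hypotheses discussed above, the sketch below checks whether a bivariate margin of a binary probability table factorizes into the product of its univariate margins. The table values and the helper function are invented for illustration and are not taken from the paper:

    ```python
    import numpy as np

    # Hypothetical joint distribution over three binary variables (X1, X2, X3),
    # stored as a 2x2x2 probability array; the numbers are illustrative only.
    p = np.array([[[0.10, 0.05], [0.15, 0.10]],
                  [[0.10, 0.15], [0.05, 0.30]]])
    assert np.isclose(p.sum(), 1.0)

    def marginally_independent(p, i, j, tol=1e-9):
        """True if the variables on axes i < j are marginally independent,
        i.e. the bivariate margin equals the product of univariate margins."""
        other_axes = tuple(k for k in range(p.ndim) if k not in (i, j))
        p_ij = p.sum(axis=other_axes)   # bivariate margin P(Xi, Xj)
        p_i = p_ij.sum(axis=1)          # univariate margin P(Xi)
        p_j = p_ij.sum(axis=0)          # univariate margin P(Xj)
        return np.allclose(p_ij, np.outer(p_i, p_j), atol=tol)

    print(marginally_independent(p, 0, 1))
    ```

    In a bi-directed graph, the absence of an edge between two vertices encodes exactly this kind of factorization of the corresponding bivariate margin.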

    Graphical Markov models: overview

    We describe how graphical Markov models started to emerge in the last 40 years, based on three essential concepts that had been developed independently more than a century ago. Sequences of joint or single regressions and their regression graphs are singled out as being best suited for analyzing longitudinal data and for tracing developmental pathways. Interpretations are illustrated using two sets of data, and some of the more recent, important results for sequences of regressions are summarized. Comment: 22 pages, 9 figures

    Graphical Markov models, unifying results and their interpretation

    Graphical Markov models combine conditional independence constraints with graphical representations of stepwise data generating processes. The models started to be formulated about 40 years ago and vigorous development is ongoing. Longitudinal observational studies as well as intervention studies are best modeled via a subclass called regression graph models and, especially, traceable regressions. Regression graphs include two types of undirected graph and directed acyclic graphs in ordered sequences of joint responses. Response components may correspond to discrete or continuous random variables and may depend exclusively on variables which have been generated earlier. These aspects are essential when causal hypotheses are the motivation for the planning of empirical studies. To turn the graphs into useful tools for tracing developmental pathways and for predicting structure in alternative models, the generated distributions have to mimic some properties of joint Gaussian distributions. Here, relevant results concerning these aspects are spelled out and illustrated by examples. With regression graph models, it becomes feasible, for the first time, to derive structural effects of (1) ignoring some of the variables, (2) selecting subpopulations via fixed levels of some other variables, or (3) changing the order in which the variables might get generated. Thus, the most important future applications of these models will aim at the best possible integration of knowledge from related studies. Comment: 34 pages, 11 figures, 1 table
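    The stepwise data generating processes behind regression graphs can be illustrated with a small simulation: variables are generated in order, each via a regression on those generated earlier. The coefficients and variable names below are illustrative assumptions, not taken from the paper:

    ```python
    import numpy as np

    # A stepwise Gaussian generating process over three variables, generated
    # in the order X3, then X2 given X3, then X1 given (X2, X3); each step is
    # a single regression on the previously generated variables.
    rng = np.random.default_rng(0)
    n = 10_000
    x3 = rng.normal(size=n)                        # background variable
    x2 = 0.8 * x3 + rng.normal(size=n)             # regression of X2 on X3
    x1 = 0.5 * x2 - 0.3 * x3 + rng.normal(size=n)  # regression of X1 on X2, X3

    # The marginal association of X1 and X3 mixes the direct effect (-0.3)
    # with the indirect pathway X3 -> X2 -> X1 (0.8 * 0.5 = 0.4), so tracing
    # the graph is needed to disentangle the two.
    print(np.corrcoef(x1, x3)[0, 1])
    ```

    This is the kind of structural effect the abstract alludes to: marginalizing over X2 changes the apparent dependence between X1 and X3.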

    The Grow-Shrink strategy for learning Markov network structures constrained by context-specific independences

    Markov networks are models for compactly representing complex probability distributions. They are composed of a structure and a set of numerical weights. The structure qualitatively describes independences in the distribution, which can be exploited to factorize the distribution into a set of compact functions. A key application of learning structures from data is to automatically discover knowledge. In practice, structure learning algorithms focused on "knowledge discovery" have a limitation: they use a coarse-grained representation of the structure. As a result, this representation cannot describe context-specific independences. Very recently, an algorithm called CSPC was designed to overcome this limitation, but it has a high computational complexity. This work tries to mitigate this downside by presenting CSGS, an algorithm that uses the Grow-Shrink strategy to reduce unnecessary computations. In an empirical evaluation, the structures learned by CSGS achieve competitive accuracies and lower computational complexity with respect to those obtained by CSPC. Comment: 12 pages and 8 figures. This work was presented at IBERAMIA 201
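    The Grow-Shrink strategy named in the title can be sketched in a few lines. This is the classical Markov-blanket version with a hypothetical conditional-independence oracle `indep(x, y, given)`; the context-specific machinery of CSPC/CSGS is not covered here:

    ```python
    # A minimal sketch of the Grow-Shrink (GS) strategy for finding the
    # Markov blanket of a target variable, assuming an oracle `indep` that
    # answers conditional-independence queries.
    def grow_shrink(target, variables, indep):
        blanket = set()
        # Grow phase: add any variable still dependent on the target
        # given the current candidate blanket.
        changed = True
        while changed:
            changed = False
            for v in variables:
                if v != target and v not in blanket and not indep(target, v, blanket):
                    blanket.add(v)
                    changed = True
        # Shrink phase: remove false positives rendered independent
        # by the rest of the blanket.
        for v in list(blanket):
            if indep(target, v, blanket - {v}):
                blanket.discard(v)
        return blanket
    ```

    With a chain structure a — b — c, for example, the oracle separates a from c given b, and the procedure recovers {a, c} as the blanket of b and {b} as the blanket of a.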

    Concepts and a case study for a flexible class of graphical Markov models

    With graphical Markov models, one can investigate complex dependences, summarize some results of statistical analyses with graphs, and use these graphs to understand implications of well-fitting models. The models have a rich history and form an area that has been intensively studied and developed in recent years. We give a brief review of the main concepts and describe in more detail a flexible subclass of models called traceable regressions. These are sequences of joint response regressions for which regression graphs permit one to trace, and thereby understand, pathways of dependence. We use these methods to reanalyze and interpret data from a prospective study of child development, now known as the Mannheim Study of Children at Risk. The two related primary features concern a child's cognitive and motor development at the ages of 4.5 and 8 years. Deficits in these features form a sequence of joint responses. Several possible risks are assessed at the child's birth and when the child reached the ages of 3 months and 2 years. Comment: 21 pages, 7 figures, 7 tables; invited, refereed chapter in a book