    MAXIMUM ENTROPY RELAXATION FOR MULTISCALE GRAPHICAL MODEL SELECTION

    We consider the problem of learning multiscale graphical models. Given a collection of variables along with covariance specifications for these variables, we introduce hidden variables and learn a sparse graphical model approximation over the entire set of variables (original and hidden). Our method for learning such models is based on maximizing entropy over an exponential family of graphical models, subject to divergence constraints on small subsets of variables. We demonstrate the advantages of our approach compared to methods that do not use hidden variables (which fail to capture long-range behavior) and methods that use tree-structured approximations (which produce blocky artifacts).

    Index Terms — Graphical models, multiscale models, maximum entropy principle, model selection, hidden variables
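    As a toy illustration of the maximum-entropy principle the abstract invokes (a minimal sketch, not the authors' algorithm): for jointly Gaussian variables, differential entropy grows with log det of the covariance, so maximizing entropy subject to specified marginal covariances forces the *unspecified* entries of the precision (inverse covariance) matrix to zero, i.e., it yields a sparse graphical model. For a 3-variable chain with hypothetical values, the max-entropy completion of the one unspecified covariance entry has a closed form, and the resulting precision matrix has no direct edge between the end variables:

    ```python
    import numpy as np

    # Hypothetical covariance specifications for a 3-variable chain 1-2-3:
    # variances and the two "adjacent" covariances are given; s13 is left free.
    s11, s22, s33 = 1.0, 1.0, 1.0
    s12, s23 = 0.6, 0.5

    # The max-entropy (max log-det) completion of the free entry zeroes the
    # corresponding entry of the precision matrix; for a 3-node chain this
    # gives the closed form below.
    s13 = s12 * s23 / s22

    Sigma = np.array([[s11, s12, s13],
                      [s12, s22, s23],
                      [s13, s23, s33]])

    # Precision (information) matrix: its zero pattern is the graph structure.
    J = np.linalg.inv(Sigma)
    print(abs(J[0, 2]) < 1e-8)  # no direct 1-3 edge in the learned graph
    ```

    Any other value of `s13` consistent with positive definiteness gives a smaller log det, hence lower entropy; the sparsity of `J` is what makes the learned model a *graphical* model rather than a dense covariance fit.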