
    Factorial graphical lasso for dynamic networks

    Dynamic network models describe a growing number of important scientific processes, from cell biology and epidemiology to sociology and finance. Many aspects of dynamic networks require statistical consideration; in this paper we focus on determining network structure. Estimating dynamic networks is a difficult task because the number of components involved in the system is very large, so the number of parameters to be estimated exceeds the number of observations. However, a characteristic of many networks is that they are sparse. For example, the molecular structure of genes makes interactions with other components a highly structured, and therefore sparse, process. Penalized Gaussian graphical models have been used to estimate sparse networks, but the literature has focused on static networks, which lack specific temporal constraints. We propose a structured Gaussian dynamical graphical model, where the structure can consist of specific time dynamics, known presence or absence of links, and block equality constraints on the parameters. This reduces the number of parameters to be estimated and improves the accuracy of the estimates, including identification of the network. We show that the constrained optimization problem can be solved with an efficient solver, logdetPPA, developed in convex optimization. Moreover, model selection methods for checking the sensitivity of the inferred networks are described. Finally, synthetic and real data illustrate the proposed methodologies.
    Comment: 30 pp, 5 figures
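    The penalized Gaussian graphical model underlying this approach estimates a sparse precision (inverse covariance) matrix whose non-zero off-diagonal entries define the network edges. Below is a minimal sketch of the plain, static graphical lasso step using scikit-learn's GraphicalLasso; it does not reproduce the paper's factorial/structured constraints or the logdetPPA solver, and the data, variable names, and penalty value are illustrative assumptions.

```python
# Minimal sketch: sparse network estimation via the (static) graphical lasso.
# NOT the paper's structured dynamic estimator; it only illustrates how a
# penalized Gaussian graphical model recovers a sparse precision matrix.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Synthetic data: n observations of p variables (in practice p may exceed n).
n, p = 50, 20
X = rng.standard_normal((n, p))

# alpha is the l1 penalty; larger values yield sparser networks (assumed value).
model = GraphicalLasso(alpha=0.2).fit(X)

# Non-zero off-diagonal entries of the precision matrix are the estimated edges.
precision = model.precision_
edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
print(f"estimated edges: {len(edges)}")
```

    The paper's structured variant would additionally impose known zero/non-zero links, block equality, and temporal constraints on this precision matrix, shrinking the effective number of free parameters.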

    Bethe Projections for Non-Local Inference

    Many inference problems in structured prediction are naturally solved by augmenting a tractable dependency structure with complex, non-local auxiliary objectives. This includes the mean-field family of variational inference algorithms, soft- or hard-constrained inference using Lagrangian relaxation or linear programming, collective graphical models, and forms of semi-supervised learning such as posterior regularization. We present a method to discriminatively learn broad families of inference objectives, capturing powerful non-local statistics of the latent variables, while maintaining tractable and provably fast inference using non-Euclidean projected gradient descent with a distance-generating function given by the Bethe entropy. We demonstrate the performance and flexibility of our method by (1) extracting structured citations from research papers by learning soft global constraints, (2) achieving state-of-the-art results on a widely used handwriting recognition task using a novel learned non-convex inference procedure, and (3) providing a fast and highly scalable algorithm for the challenging problem of inference in a collective graphical model applied to bird migration.
    Comment: minor bug fix to appendix; appeared in UAI 201
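    The inference routine described here is a mirror-descent-style update: projected gradient descent whose distance-generating function is an entropy rather than the squared Euclidean norm. The sketch below shows only the simplest special case, entropic mirror descent (exponentiated gradient) over a probability simplex; the Bethe-entropy projection over locally consistent marginals used in the paper is not reproduced, and the objective, step size, and function names are illustrative assumptions.

```python
# Minimal sketch: mirror descent with an entropic distance-generating function
# (exponentiated gradient) on the probability simplex. The paper's method uses
# the Bethe entropy over marginal polytopes; this toy version only illustrates
# the non-Euclidean projected-gradient idea.
import numpy as np

def entropic_mirror_descent(grad, x0, step=0.5, iters=100):
    """Minimize a convex objective over the simplex, given a gradient oracle."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Multiplicative update = gradient step in the dual (log) space,
        # followed by the entropic projection back onto the simplex.
        x = x * np.exp(-step * grad(x))
        x = x / x.sum()
    return x

# Toy objective: squared distance to a target distribution (assumed example).
target = np.array([0.7, 0.2, 0.1])
grad = lambda x: 2.0 * (x - target)

x_star = entropic_mirror_descent(grad, x0=np.ones(3) / 3)
print(np.round(x_star, 3))  # approaches the target distribution
```

    The multiplicative form of the update is what makes the projection cheap: with an entropic distance-generating function, the Bregman projection onto the simplex reduces to renormalization.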