
    Auxiliary Guided Autoregressive Variational Autoencoders

    Generative modeling of high-dimensional data is a key problem in machine learning. Successful approaches include latent variable models and autoregressive models. The complementary strengths of these approaches, modeling global and local image statistics respectively, suggest hybrid models that encode global image structure into latent variables while autoregressively modeling low-level detail. Previous approaches to such hybrid models restrict the capacity of the autoregressive decoder to prevent degenerate models that ignore the latent variables and rely only on autoregressive modeling. Our contribution is a training procedure relying on an auxiliary loss function that controls which information is captured by the latent variables and what is left to the autoregressive decoder. Our approach can leverage arbitrarily powerful autoregressive decoders, achieves state-of-the-art quantitative performance among models with latent variables, and generates qualitatively convincing samples.
    Comment: Published as a conference paper at ECML-PKDD 201
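    The auxiliary-loss idea can be made concrete with a small sketch: a VAE whose main likelihood comes from an autoregressive (PixelCNN-style) decoder conditioned on the latent code, plus an auxiliary decoder that must reconstruct the image from the latent code alone, so the latents cannot be ignored. The layer choices, shapes (28x28 binarised images), and the `aux_weight` coefficient below are illustrative assumptions, not the architecture or exact objective from the paper.

    ```python
    # Minimal PyTorch sketch of a hybrid latent-variable / autoregressive model
    # trained with an auxiliary reconstruction loss on the latent code.
    # Everything here is a simplified illustration, not the paper's model.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MaskedConv2d(nn.Conv2d):
        """Type-A masked convolution: each output pixel sees only pixels above it
        and to its left, never itself, so the decoder stays autoregressive."""
        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            _, _, kh, kw = self.weight.shape
            mask = torch.ones_like(self.weight)
            mask[:, :, kh // 2, kw // 2:] = 0  # centre pixel and everything right of it
            mask[:, :, kh // 2 + 1:, :] = 0    # every row below the centre row
            self.register_buffer("mask", mask)

        def forward(self, x):
            return F.conv2d(x, self.weight * self.mask, self.bias,
                            self.stride, self.padding)

    class HybridVAE(nn.Module):
        """VAE with an autoregressive main decoder and an auxiliary decoder on z."""
        def __init__(self, latent_dim=32):
            super().__init__()
            self.enc = nn.Sequential(nn.Conv2d(1, 32, 4, 2, 1), nn.ReLU(),
                                     nn.Conv2d(32, 64, 4, 2, 1), nn.ReLU(),
                                     nn.Flatten())
            self.to_mu = nn.Linear(64 * 7 * 7, latent_dim)
            self.to_logvar = nn.Linear(64 * 7 * 7, latent_dim)
            # Auxiliary decoder: must reconstruct the image from z alone.
            self.aux_dec = nn.Linear(latent_dim, 28 * 28)
            # Autoregressive decoder: a masked conv over x, conditioned on z
            # through a per-channel bias (a much-simplified PixelCNN).
            self.masked_conv = MaskedConv2d(1, 64, 7, padding=3)
            self.z_proj = nn.Linear(latent_dim, 64)
            self.out_conv = nn.Conv2d(64, 1, 1)

        def forward(self, x):
            h = self.enc(x)
            mu, logvar = self.to_mu(h), self.to_logvar(h)
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterisation
            aux_logits = self.aux_dec(z).view_as(x)
            ar_hidden = F.relu(self.masked_conv(x) + self.z_proj(z)[:, :, None, None])
            ar_logits = self.out_conv(ar_hidden)
            return mu, logvar, aux_logits, ar_logits

    def hybrid_loss(x, mu, logvar, aux_logits, ar_logits, aux_weight=1.0):
        """ELBO-style objective plus the auxiliary reconstruction term."""
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        ar_nll = F.binary_cross_entropy_with_logits(ar_logits, x, reduction="sum")
        # The auxiliary term forces z to explain global structure on its own,
        # so a powerful AR decoder cannot simply ignore the latent variables.
        aux_nll = F.binary_cross_entropy_with_logits(aux_logits, x, reduction="sum")
        return ar_nll + kl + aux_weight * aux_nll
    ```

    In this sketch the autoregressive term models low-level detail given z, while the weight on the auxiliary term tunes how much global content the latents must carry on their own.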

    Theoretical Foundations of Autoregressive Models for Time Series on Acyclic Directed Graphs

    Three classes of models for time series on acyclic directed graphs are considered. First, a review of tree-structured models constructed from a nested partitioning of the observation interval is given. This nested partitioning leads to several resolution scales. The concept of mass balance, which allows the average over an interval to be interpreted as the sum of the averages over its sub-intervals, implies linear restrictions in the tree-structured model. Under a white noise assumption for transition and observation noise there is a change-of-resolution Kalman filter for linear least squares prediction of interval averages (Chou, 1991). This class of models is generalized by modeling transition noise on the same scale in linear state space form. The third class deals with models on a more general class of directed acyclic graphs where nodes are allowed to have two parents. We show that these models have a linear state space representation with white system and coloured observation noise.
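    Two of the building blocks mentioned above are easy to illustrate with a small sketch: the mass-balance consistency between an interval and its sub-intervals on a dyadic tree, and a linear least squares predict/update step for a linear state space model with white noise. The sketch below is a generic illustration under those assumptions (equal-length sub-intervals, where the parent average is the mean of its children's averages), not the change-of-resolution filter of Chou (1991) itself; all function names and dimensions are hypothetical.

    ```python
    # Minimal NumPy sketch of (i) dyadic interval averages with the mass-balance
    # consistency between scales and (ii) a generic linear-Gaussian Kalman
    # predict/update step of the kind applied scale-to-scale. Illustration only.
    import numpy as np

    def dyadic_averages(x):
        """Return interval averages at each resolution scale of a dyadic tree.

        scales[0] is the coarsest scale (one average over the whole interval);
        the last entry is the finest scale (the observations themselves).
        """
        assert x.size & (x.size - 1) == 0, "length must be a power of two"
        scales = [x.astype(float)]
        while scales[0].size > 1:
            fine = scales[0]
            coarse = 0.5 * (fine[0::2] + fine[1::2])  # parent = mean of its two children
            scales.insert(0, coarse)
        return scales

    def kalman_step(m, P, y, F_mat, Q, H, R):
        """One predict/update step for x_{k+1} = F x_k + w, y_k = H x_k + v (white noise)."""
        # Predict
        m_pred = F_mat @ m
        P_pred = F_mat @ P @ F_mat.T + Q
        # Update: linear least squares correction given the new observation
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        m_new = m_pred + K @ (y - H @ m_pred)
        P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
        return m_new, P_new

    if __name__ == "__main__":
        x = np.arange(8.0)
        scales = dyadic_averages(x)
        # Mass-balance check: every parent average is consistent with its children.
        for coarse, fine in zip(scales[:-1], scales[1:]):
            assert np.allclose(coarse, 0.5 * (fine[0::2] + fine[1::2]))
        print([s.tolist() for s in scales])
    ```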