
    Representation of Markov chains by random maps: existence and regularity conditions

    We systematically investigate the problem of representing Markov chains by families of random maps, and we ask which regularity of these maps can be achieved depending on the properties of the probability measures. Our key idea is to use techniques from optimal transport to select optimal such maps. Optimal transport theory also tells us how convexity properties of the supports of the measures translate into regularity properties of the maps via Legendre transforms. From this scheme we can not only deduce the representation by measurable random maps, but also obtain conditions for the representation by continuous random maps. Finally, we present conditions for the representation of Markov chains by random diffeomorphisms.

    Comment: 22 pages, several changes from the previous version including extended discussion of many details
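
    To make the random-map representation concrete, here is a minimal numerical sketch (ours, not from the paper) for a Markov chain on the real line: in one dimension, the optimal transport map for quadratic cost from Uniform(0,1) to the transition kernel P(x, .) is the quantile function of P(x, .), so f(x, u) = F_x^{-1}(u) gives a representation X_{n+1} = f(X_n, U_n) with i.i.d. U_n ~ Uniform(0,1). The Gaussian AR(1) kernel below is an illustrative assumption.

        # Sketch: random-map representation of a Markov chain via 1D optimal transport.
        import numpy as np
        from scipy.stats import norm

        def f(x, u, rho=0.8, sigma=0.6):
            # Transition kernel P(x, .) = N(rho*x, sigma^2); its quantile function
            # u -> F_x^{-1}(u) is the optimal transport map from Uniform(0,1).
            return norm.ppf(u, loc=rho * x, scale=sigma)

        rng = np.random.default_rng(0)
        x, path = 0.0, []
        for _ in range(10_000):
            x = f(x, rng.uniform())  # X_{n+1} = f(X_n, U_n)
            path.append(x)
        # Empirical moments approach the AR(1) stationary law N(0, sigma^2/(1-rho^2)).
        print(np.mean(path), np.var(path))

    Note that for each fixed u, the map x -> f(x, u) here is an affine diffeomorphism of the line, illustrating how regularity of the kernel can transfer to regularity of the representing maps.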

    Kernel methods in machine learning

    We review machine learning methods employing positive definite kernels. These methods formulate learning and estimation problems in a reproducing kernel Hilbert space (RKHS) of functions defined on the data domain, expanded in terms of a kernel. Working in linear spaces of functions has the benefit of facilitating the construction and analysis of learning algorithms while at the same time allowing large classes of functions. The latter include nonlinear functions as well as functions defined on nonvectorial data. We cover a wide range of methods, ranging from binary classifiers to sophisticated methods for estimation with structured data.

    Comment: Published at http://dx.doi.org/10.1214/009053607000000677 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
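
    As a concrete instance of the kernel methods such a review covers, here is a minimal sketch (ours, not from the paper) of kernel ridge regression with a Gaussian kernel: by the representer theorem, the RKHS solution is a finite expansion f(x) = sum_i alpha_i k(x_i, x) over the training points.

        # Sketch: kernel ridge regression in the RKHS of a Gaussian (RBF) kernel.
        import numpy as np

        def rbf_kernel(X, Y, gamma=1.0):
            # Positive definite kernel k(x, y) = exp(-gamma * ||x - y||^2).
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(0)
        X = rng.uniform(-3, 3, size=(50, 1))
        y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

        lam = 1e-2                                 # ridge regularization strength
        K = rbf_kernel(X, X)
        alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # (K + lam*I) alpha = y

        X_new = np.linspace(-3, 3, 5)[:, None]
        print(rbf_kernel(X_new, X) @ alpha)        # f(x) = sum_i alpha_i k(x_i, x)

    The same template (kernel matrix plus a regularized linear solve or convex program) underlies many of the methods in this family, from support vector classifiers to structured prediction.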

    On Graphical Models via Univariate Exponential Family Distributions

    Undirected graphical models, or Markov networks, are a popular class of statistical models used in a wide variety of applications. Popular instances of this class include Gaussian graphical models and Ising models. In many settings, however, it might not be clear which subclass of graphical models to use, particularly for non-Gaussian and non-categorical data. In this paper, we consider a general subclass of graphical models where the node-wise conditional distributions arise from exponential families. This allows us to derive multivariate graphical model distributions from univariate exponential family distributions, such as the Poisson, negative binomial, and exponential distributions. Our key contributions include a class of M-estimators to fit these graphical model distributions, and a rigorous statistical analysis showing that these M-estimators recover the true graphical model structure exactly, with high probability. We provide examples of genomic and proteomic networks learned via instances of our class of graphical models derived from Poisson and exponential distributions.

    Comment: Journal of Machine Learning Research
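
    The node-wise M-estimation idea can be sketched in its best-known special case, the Ising model, where each node's conditional distribution given the rest is a logistic GLM, so an l1-penalized logistic regression per node estimates that node's neighborhood in the graph. The toy chain-graph data and the penalty level C below are illustrative assumptions, not the paper's experiments.

        # Sketch: neighborhood selection for an Ising model by node-wise
        # l1-regularized logistic regression (one instance of the M-estimator class).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 2000
        x0 = rng.integers(0, 2, n)                      # chain graph 0 - 1 - 2
        x1 = (x0 ^ (rng.random(n) < 0.2)).astype(int)   # flip x0 with prob 0.2
        x2 = (x1 ^ (rng.random(n) < 0.2)).astype(int)   # flip x1 with prob 0.2
        X = np.column_stack([x0, x1, x2])

        p = X.shape[1]
        adj = np.zeros((p, p), dtype=bool)
        for s in range(p):
            rest = np.delete(np.arange(p), s)
            clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
            clf.fit(X[:, rest], X[:, s])                # conditional GLM for node s
            adj[s, rest] = np.abs(clf.coef_[0]) > 1e-6  # nonzero coefficients = neighbors
        print((adj & adj.T).astype(int))                # expect edges (0,1) and (1,2) only

    For other exponential families the per-node regression changes accordingly (e.g., an l1-penalized Poisson GLM for count data), while the overall neighborhood-selection scheme stays the same.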