
    Nonlinear Markov Processes in Big Networks

    Big networks represent various large-scale networks arising in many practical areas such as computer networks, the Internet of Things, cloud computing, manufacturing systems, transportation networks, and healthcare systems. This paper analyzes such big networks, applying mean-field theory and nonlinear Markov processes to set up a broad class of nonlinear continuous-time block-structured Markov processes that can be applied to many practical stochastic systems. Firstly, a nonlinear Markov process is derived from a large number of interacting big networks with symmetric interactions, each of which is described as a continuous-time block-structured Markov process. Secondly, effective algorithms are given for computing the fixed points of the nonlinear Markov process by means of the UL-type RG-factorization. Finally, the Birkhoff center, Lyapunov functions, and relative entropy are used to analyze stability or metastability of the big network, and several interesting open problems are proposed with detailed interpretation. We believe that the results given in this paper can be useful and effective in the study of big networks.
    Comment: 28 pages in Special Matrices; 201
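The mean-field construction above leads to a generator Q(p) that depends on the current distribution p over states, whose fixed points satisfy pi·Q(pi) = 0. A minimal sketch of this idea, using a toy two-state node with made-up symmetric interaction rates (not the paper's RG-factorization algorithm):

```python
import numpy as np

# Toy sketch of a mean-field fixed point: the generator Q(p) of one
# node depends on the empirical distribution p of all nodes. The
# rates below are illustrative, not taken from the paper.
def generator(p):
    # 2-state node; the rate into state 1 grows with the fraction
    # of nodes already in state 1 (symmetric interaction)
    up = 0.5 + 1.5 * p[1]
    down = 1.0
    return np.array([[-up,  up],
                     [down, -down]])

def stationary(Q):
    # solve pi Q = 0 subject to sum(pi) = 1
    A = np.vstack([Q.T, np.ones(Q.shape[0])])
    b = np.zeros(Q.shape[0] + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

p = np.array([0.5, 0.5])
for _ in range(100):            # fixed-point iteration p <- pi(Q(p))
    p = stationary(generator(p))
```

For this toy model the iteration is a contraction and converges to the unique fixed point p[1] = sqrt(1/3).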

    Fast MCMC sampling for Markov jump processes and extensions

    Markov jump processes (or continuous-time Markov chains) are a simple and important class of continuous-time dynamical systems. In this paper, we tackle the problem of simulating from the posterior distribution over paths in these models, given partial and noisy observations. Our approach is an auxiliary variable Gibbs sampler based on the idea of uniformization. It sets up a Markov chain over paths by alternately sampling a finite set of virtual jump times given the current path, and then sampling a new path given the set of extant and virtual jump times using a standard hidden Markov model forward filtering-backward sampling algorithm. Our method is exact and does not involve approximations like time-discretization. We demonstrate how our sampler extends naturally to MJP-based models like Markov-modulated Poisson processes and continuous-time Bayesian networks, and show significant computational benefits over state-of-the-art MCMC samplers for these models.
    Comment: Accepted at the Journal of Machine Learning Research (JMLR)
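The uniformization idea underlying the sampler can be sketched in isolation: candidate jump times come from a Poisson process of rate omega, and each candidate applies the subordinated kernel B = I + Q/omega, whose self-transitions are the "virtual" jumps. A minimal forward simulation under this representation, with an illustrative generator (the full posterior Gibbs sampler of the paper is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 3-state generator (rows sum to zero); not from the paper.
Q = np.array([[-1.0,  0.6,  0.4],
              [ 0.5, -1.5,  1.0],
              [ 0.3,  0.7, -1.0]])

def sample_path_uniformization(Q, s0, T, rng, omega=None):
    """Simulate an MJP path on [0, T] by uniformization: candidate
    jump times are Poisson(omega); each candidate applies the
    kernel B = I + Q/omega, where a self-transition corresponds
    to a virtual (thinned) jump."""
    n = Q.shape[0]
    if omega is None:
        omega = 2.0 * max(-Q[i, i] for i in range(n))  # omega > max exit rate
    B = np.eye(n) + Q / omega
    times, states = [0.0], [s0]
    t = 0.0
    while True:
        t += rng.exponential(1.0 / omega)
        if t > T:
            break
        s = rng.choice(n, p=B[states[-1]])
        times.append(t)
        states.append(s)   # may equal the previous state: a virtual jump
    return times, states

times, states = sample_path_uniformization(Q, 0, 5.0, rng)
```

Choosing omega strictly larger than every exit rate keeps B a proper stochastic matrix; larger omega means more virtual jumps but the same path law after thinning.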

    Using imprecise continuous time Markov chains for assessing the reliability of power networks with common cause failure and non-immediate repair

    We explore how imprecise continuous time Markov chains can improve traditional reliability models based on precise continuous time Markov chains. Specifically, we analyse the reliability of power networks under very weak statistical assumptions, explicitly accounting for non-stationary failure and repair rates and the limited accuracy with which common cause failure rates can be estimated. Bounds on typical quantities of interest are derived, namely the expected time spent in the system failure state and the expected number of transitions to that state. A worked numerical example demonstrates the theoretical techniques described. Interestingly, the number of iterations required for convergence is observed to be much lower than current theoretical bounds.
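The bounding machinery of imprecise continuous time Markov chains can be illustrated on a minimal two-state repairable system: rates are only known up to intervals, and a lower expectation is obtained by applying, in each state, the rate choice that minimises the drift. A sketch with made-up rate intervals (not the paper's power-network model):

```python
import numpy as np

# Two-state reliability sketch: state 0 = working, state 1 = failed.
# Failure and repair rates are only known up to intervals; the
# numbers below are illustrative, not from the paper.
LAM = (0.1, 0.3)   # failure rate interval
MU = (0.8, 1.2)    # repair rate interval
T, DT = 10.0, 0.001

def lower_rate(g):
    """Lower rate operator of the imprecise CTMC: in each state,
    pick the rate in its interval that minimises the drift of g."""
    dg_up = g[1] - g[0]   # drift direction seen from the working state
    dg_dn = g[0] - g[1]   # drift direction seen from the failed state
    lam = LAM[0] if dg_up > 0 else LAM[1]
    mu = MU[0] if dg_dn > 0 else MU[1]
    return np.array([lam * dg_up, mu * dg_dn])

# Backward Euler sweep for the lower expectation of time spent failed
# on [0, T]: dg/dt = reward + (lower rate operator applied to g).
g = np.zeros(2)
reward = np.array([0.0, 1.0])   # accrue time while in the failed state
for _ in range(int(T / DT)):
    g = g + DT * (reward + lower_rate(g))

lower_bound = g[0]   # lower expected downtime, starting from working
```

Swapping min for max in the operator gives the corresponding upper bound; together they bracket the quantity of interest over all precise chains compatible with the intervals.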

    Elimination of Intermediate Species in Multiscale Stochastic Reaction Networks

    We study networks of biochemical reactions modelled by continuous-time Markov processes. Such networks typically contain many molecular species and reactions and are hard to study analytically as well as by simulation. In particular, we are interested in reaction networks with intermediate species, such as the substrate-enzyme complex in the Michaelis-Menten mechanism. These species are present in virtually all real-world networks; they are typically short-lived, degraded at a fast rate, and hard to observe experimentally. We provide conditions under which the Markov process of a multiscale reaction network with intermediate species is approximated in finite-dimensional distribution by the Markov process of a simpler reduced reaction network without intermediate species. We do so by embedding the Markov processes into a one-parameter family of processes, where reaction rates and species abundances are scaled in the parameter. Further, we show that there are close links between these stochastic models and deterministic ODE models of the same networks.
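The flavour of such a reduction can be sketched on the Michaelis-Menten example: eliminating the short-lived complex C from S + E <-> C -> P + E leaves a single reaction S -> P whose effective propensity takes the familiar Michaelis-Menten form. A minimal Gillespie simulation of the reduced network, with illustrative constants (the paper's general elimination conditions are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative effective parameters of the reduced reaction S -> P;
# not taken from the paper.
VMAX, KM = 5.0, 20.0

def gillespie_reduced(s0, t_end, rng):
    """Gillespie simulation of the reduced network S -> P, where the
    intermediate complex has been eliminated and absorbed into an
    effective Michaelis-Menten propensity."""
    s, t = s0, 0.0
    times, counts = [0.0], [s0]
    while s > 0:
        a = VMAX * s / (KM + s)      # effective propensity after elimination
        t += rng.exponential(1.0 / a)
        if t > t_end:
            break
        s -= 1                       # one S molecule converted to P
        times.append(t)
        counts.append(s)
    return times, counts

times, counts = gillespie_reduced(100, 200.0, rng)
```

The reduced chain has one species instead of four, which is exactly the kind of simplification the approximation result justifies in the fast-intermediate scaling regime.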

    Fast MCMC sampling for Markov jump processes and continuous time Bayesian networks

    Markov jump processes and continuous time Bayesian networks are important classes of continuous time dynamical systems. In this paper, we tackle the problem of inferring unobserved paths in these models by introducing a fast auxiliary variable Gibbs sampler. Our approach is based on the idea of uniformization, and sets up a Markov chain over paths by sampling a finite set of virtual jump times and then running a standard hidden Markov model forward filtering-backward sampling algorithm over states at the set of extant and virtual jump times. We demonstrate significant computational benefits over a state-of-the-art Gibbs sampler on a number of continuous time Bayesian networks.