
    First passage process of a Markov additive process, with applications to reflection problems

    In this paper we consider the first passage process of a spectrally negative Markov additive process (MAP). The law of this process is uniquely characterized by a certain matrix function, which plays a crucial role in fluctuation theory. We show how to identify this matrix using the theory of Jordan chains associated with analytic matrix functions. Importantly, our result also provides a technique that can be used to derive various further identities. We then proceed to show how to compute the stationary distribution associated with a one-sided reflected (at zero) MAP, for both the spectrally positive and spectrally negative cases, as well as for the two-sided reflected Markov-modulated Brownian motion; these results can be interpreted in terms of queues with MAP input. Comment: 16 pages.
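    For orientation, the following is a standard identity from the fluctuation theory of spectrally negative MAPs, sketched here as background rather than as the paper's specific result; the notation (X, J) for the level and phase processes, \tau_x for the first passage time over level x, and \Lambda(q) for the matrix function is conventional but not taken from the abstract:

        \[
            \mathbb{E}_i\!\left[ e^{-q\,\tau_x} \,;\, J(\tau_x) = j \right]
              = \left( e^{\Lambda(q)\,x} \right)_{ij},
            \qquad x \ge 0,\; q \ge 0.
        \]

    In this sense, identifying the matrix alluded to in the abstract amounts to determining \Lambda(q), the generator of the phase process observed at first passage epochs.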

    Stochastic Gradient Hamiltonian Monte Carlo

    Hamiltonian Monte Carlo (HMC) sampling methods provide a mechanism for defining distant proposals with high acceptance probabilities in a Metropolis-Hastings framework, enabling more efficient exploration of the state space than standard random-walk proposals. The popularity of such methods has grown significantly in recent years. However, a limitation of HMC methods is the required gradient computation for simulation of the Hamiltonian dynamical system; such computation is infeasible in problems involving a large sample size or streaming data. Instead, we must rely on a noisy gradient estimate computed from a subset of the data. In this paper, we explore the properties of such a stochastic gradient HMC approach. Surprisingly, the natural implementation of the stochastic approximation can be arbitrarily bad. To address this problem we introduce a variant that uses second-order Langevin dynamics with a friction term that counteracts the effects of the noisy gradient, maintaining the desired target distribution as the invariant distribution. Results on simulated data validate our theory. We also provide an application of our methods to a classification task using neural networks and to online Bayesian matrix factorization. Comment: ICML 2014 version.
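    A minimal sketch of the kind of update this describes (one step of second-order Langevin dynamics with friction), assuming an identity mass matrix and a user-supplied minibatch gradient; the function and parameter names below are illustrative and do not come from the paper:

        import numpy as np

        def sghmc_step(theta, v, stoch_grad_U, eps=1e-2, C=1.0, B_hat=0.0, rng=np.random):
            # One stochastic gradient HMC step with friction (identity mass matrix assumed).
            # stoch_grad_U(theta) returns a noisy minibatch gradient of the negative log posterior.
            # C is the friction coefficient; B_hat estimates the gradient-noise term (often set to 0).
            noise_std = np.sqrt(2.0 * (C - B_hat) * eps)
            v = (v
                 - eps * stoch_grad_U(theta)   # noisy gradient force
                 - eps * C * v                 # friction counteracting the gradient noise
                 + noise_std * rng.standard_normal(size=v.shape))
            theta = theta + eps * v
            return theta, v

    Without the friction term (C = 0), the injected gradient noise accumulates and the sampler drifts away from the target distribution, which is the failure mode of the naive implementation that the abstract refers to.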