
    Simultaneous transient analysis of QBD Markov chains for all initial configurations using a level-based recursion

    A new algorithm to assess transient performance measures for every possible initial configuration of a Quasi-Birth-and-Death (QBD) Markov chain is introduced. We make use of the framework termed QBDs with marked time epochs, which transforms the transient problem into a stationary one by applying a discrete Erlangization and constructing a reset Markov chain. To avoid the need to repeat all computations for each initial configuration, we propose a level-based recursive algorithm that uses intermediate results obtained for initial states belonging to levels 0, ..., r − 1 to compute the transient measure when the initial state is part of level r. Moreover, the computations for all states belonging to level r are performed simultaneously. A key property of our approach lies in the exploitation of the internal structure of the block matrices involved, avoiding any need to store large matrices. A flexible Matlab implementation of the proposed algorithm is available online.
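
    The central trick mentioned in this abstract, discrete Erlangization, replaces a fixed time horizon with a negative-binomially distributed one, which is what allows a transient question to be recast in stationary terms. The sketch below only illustrates that randomized-horizon idea on a toy discrete-time chain by direct summation of matrix powers; it is not the paper's reset-chain or level-based recursion, and the transition matrix P, target horizon, and stage count are made-up illustrative values.

```python
# Minimal sketch of the discrete-Erlangization idea on a toy discrete-time
# Markov chain: the transient distribution at a negative-binomially
# distributed epoch T (with mean t_target) is a pmf-weighted mix of matrix
# powers. P, t_target and k are hypothetical; this is NOT the paper's
# reset-chain algorithm, only the randomization it builds on.
import numpy as np
from scipy.stats import nbinom

P = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])   # toy transition matrix

t_target = 20.0     # time horizon to approximate
k = 10              # Erlang stages; larger k concentrates T around t_target
p = k / t_target    # T = k + F with F ~ NegBinomial(k, p), so E[T] = k / p

n_max = 400
pmf = nbinom.pmf(np.arange(n_max), k, p)   # P(F = n), n = 0, 1, ...

# E[P^T] = sum_n P(F = n) * P^(k + n); row i is the (approximate) transient
# distribution for initial state i, so all initial states are handled at once.
transient = np.zeros_like(P)
Pn = np.linalg.matrix_power(P, k)
for n in range(n_max):
    transient += pmf[n] * Pn
    Pn = Pn @ P

print(np.round(transient, 4))
```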

    A New Look at Matrix Analytic Methods

    In the past several decades, matrix analytic methods have proven effective at studying two important sub-classes of block-structured Markov processes: G/M/1-type Markov processes and M/G/1-type Markov processes. These processes are often used to model many types of random phenomena because their underlying primitives have phase-type distributions. When studying block-structured Markov processes and their sub-classes, two key quantities are the “rate matrix” R and a matrix of probabilities typically denoted G. In [30], Neuts shows that the stationary distribution of a Markov process of G/M/1-type, when it exists, possesses a matrix-geometric relationship with R. Ramaswami’s formula [32] shows that the stationary distribution of an M/G/1-type Markov process satisfies a recursion involving a well-defined matrix of probabilities, typically denoted G. The first result we present is a new derivation of the stationary distribution for Markov processes of G/M/1-type using the random-product theory found in Buckingham and Fralix [9]. This method can also be modified to derive the Laplace transform of each transition function associated with a G/M/1-type Markov process. Next, we study the time-dependent behavior of block-structured Markov processes. In [15], Grassmann and Heyman show that the stationary distribution of a block-structured Markov process can be expressed in terms of infinitely many R and G matrices. We show that the Laplace transforms of the transition functions associated with block-structured Markov processes satisfy a recursion involving an infinite collection of R matrices. These R matrices can in turn be expressed in terms of an infinite collection of G matrices, which are solutions to fixed-point equations and can be computed iteratively. Our final result uses the random-product theory to study an M/M/1 queueing model in a two-state random environment. Although such a model is a block-structured Markov process, we avoid computing any R or G matrices and instead show that the stationary distribution can be written exactly as a linear combination of scalars that can be determined recursively.
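
    Since this abstract turns on the R and G matrices and on G being the solution of a fixed-point equation that can be computed iteratively, a small numerical sketch may help fix ideas. The snippet below treats a toy discrete-time QBD with one-step blocks A0 (up one level), A1 (same level) and A2 (down one level): it runs the classical functional iteration G <- A2 + A1 G + A0 G^2 and then forms R = A0 (I - A1 - A0 G)^{-1}, after which the stationary distribution away from the boundary is matrix-geometric. The blocks and tolerances are made-up illustrative values, and this is the textbook QBD special case rather than the general G/M/1- or M/G/1-type setting (or the random-product derivations) studied in the thesis.

```python
# Toy QBD example with hypothetical blocks: compute G by functional iteration
# and R from G. A0 = up one level, A1 = same level, A2 = down one level;
# A0 + A1 + A2 is stochastic and the chain drifts downward, so G is stochastic.
import numpy as np

A0 = np.array([[0.2, 0.1], [0.1, 0.2]])   # up
A1 = np.array([[0.1, 0.2], [0.2, 0.1]])   # local
A2 = np.array([[0.3, 0.1], [0.2, 0.2]])   # down

# Fixed-point iteration G <- A2 + A1 G + A0 G^2, started from G = 0.
G = np.zeros_like(A0)
for _ in range(10_000):
    G_next = A2 + A1 @ G + A0 @ G @ G
    if np.max(np.abs(G_next - G)) < 1e-12:
        G = G_next
        break
    G = G_next

# Rate matrix: R = A0 (I - A1 - A0 G)^{-1}.
I = np.eye(2)
R = A0 @ np.linalg.inv(I - A1 - A0 @ G)

print("G row sums:", G.sum(axis=1))   # ~1 when the chain is positive recurrent
print("R =\n", R)
# Away from the boundary, the stationary vector satisfies pi_{k+1} = pi_k @ R,
# which is the matrix-geometric relationship referred to in the abstract.
```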