
    Asymptotically optimal importance sampling for Jackson networks with a tree topology

    This note describes an importance sampling (IS) algorithm to estimate buffer overflow probabilities in stable Jackson networks with a tree topology. Three new measures of service capacity and traffic in Jackson networks are introduced, and the algorithm is defined in their terms. These measures are the effective service rate, the effective utilization, and the effective service-to-arrival ratio of a node. They depend on the nonempty/empty states of the queues of the network. For a node with a nonempty queue, the effective service rate equals the node's nominal service rate. For a node i with an empty queue, it is either a weighted sum of the effective service rates of the nodes receiving traffic directly from node i, or the nominal service rate, whichever is smaller. The effective utilization is the ratio of the arrival rate to the effective service rate, and the effective service-to-arrival ratio is its reciprocal. The rare overflow event of interest is the following: starting from an empty network, the system experiences a buffer overflow before returning to the empty state. Two types of buffer structures are considered: (1) a single system-wide buffer shared by all nodes, and (2) a separate fixed-size buffer at each node. The constructed IS algorithm is asymptotically optimal, i.e., the variance of the associated estimator decays exponentially in the buffer size at the maximum possible rate. This is proved using methods from (Dupuis et al. in Ann. Appl. Probab. 17(4):1306-1346, 2007), which are based on a limiting Hamilton-Jacobi-Bellman equation, its boundary conditions, and their smooth subsolutions. Numerical examples involving networks with as many as eight nodes are provided.
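
    The effective service rate described above is recursive over the tree: a nonempty node uses its nominal rate, while an empty node is capped by its downstream nodes. Below is a minimal sketch of that recursion, under the assumption (not stated in the abstract) that the weights are the routing probabilities and that an empty node with no downstream nodes keeps its nominal rate.

```python
# Sketch of the effective-rate recursion paraphrased from the abstract;
# the downstream weights are assumed (not confirmed) to be routing probabilities.

def effective_measures(mu, lam, routing, empty):
    """mu[i]: nominal service rate, lam[i]: arrival rate at node i,
    routing[i]: {j: p_ij} for nodes j fed directly by node i,
    empty[i]: True if queue i is currently empty."""
    eff = {}

    def rate(i):
        if i not in eff:
            if not empty[i]:
                eff[i] = mu[i]                        # nonempty queue: nominal rate
            elif routing[i]:
                downstream = sum(p * rate(j) for j, p in routing[i].items())
                eff[i] = min(mu[i], downstream)       # empty queue: capped by downstream
            else:
                eff[i] = mu[i]                        # assumed: empty leaf keeps nominal rate
        return eff[i]

    for i in mu:
        rate(i)
    utilization = {i: lam[i] / eff[i] for i in mu}          # effective utilization
    service_to_arrival = {i: eff[i] / lam[i] for i in mu}   # its reciprocal
    return eff, utilization, service_to_arrival


# Toy 3-node tree: node 0 feeds nodes 1 and 2 with equal probability.
mu = {0: 3.0, 1: 1.0, 2: 2.0}
lam = {0: 0.5, 1: 0.25, 2: 0.25}
routing = {0: {1: 0.5, 2: 0.5}, 1: {}, 2: {}}
empty = {0: True, 1: True, 2: False}
print(effective_measures(mu, lam, routing, empty))
```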

    Efficient simulation of large deviation events for sums of random vectors using saddle-point representations

    We consider the problem of efficient simulation estimation of the density function at the tails, and the probability of large deviations, for a sum of independent, identically distributed (i.i.d.), light-tailed and nonlattice random vectors. The latter problem, besides being of independent interest, also forms a building block for more complex rare event problems that arise, for instance, in queuing and financial credit risk modeling. It has been extensively studied in the literature, where state-independent, exponential-twisting-based importance sampling has been shown to be asymptotically efficient, and a more nuanced state-dependent exponential twisting has been shown to have the stronger bounded relative error property. We exploit the saddle-point-based representations that exist for these rare quantities, which rely on inverting the characteristic functions of the underlying random vectors. These representations reduce the rare event estimation problem to evaluating certain integrals, which may, via importance sampling, be represented as expectations. Furthermore, it is easy to identify and approximate the zero-variance importance sampling distribution to estimate these integrals. We identify such importance sampling measures and show that they possess the asymptotically vanishing relative error property, which is stronger than the bounded relative error property. To illustrate the broader applicability of the proposed methodology, we extend it to develop an asymptotically vanishing relative error estimator for the practically important expected overshoot of sums of i.i.d. random variables.
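
    The abstract's point of comparison is classical state-independent exponential twisting for tail probabilities of i.i.d. sums. Below is a minimal sketch of that baseline (not the paper's saddle-point estimator) for Exp(1) summands, where the twisted distribution and cumulant generating function are available in closed form; all parameter choices are illustrative assumptions.

```python
import numpy as np

def twisted_tail_estimate(n, a, runs=100_000, seed=0):
    """Exponential-twisting IS estimate of P(S_n >= n*a) for S_n a sum
    of n i.i.d. Exp(1) variables, with a > 1."""
    rng = np.random.default_rng(seed)
    # Cumulant generating function of Exp(1): Lambda(t) = -log(1 - t), t < 1.
    # The twist t* solves Lambda'(t*) = a, giving t* = 1 - 1/a (mean shifted to a).
    t_star = 1.0 - 1.0 / a
    log_mgf = -np.log(1.0 - t_star)                  # Lambda(t*) = log(a)
    # Under the twisted measure the summands are Exp(rate 1/a), i.e. mean a.
    samples = rng.exponential(scale=a, size=(runs, n))
    s = samples.sum(axis=1)
    # Likelihood ratio dP/dQ = exp(-t* S_n + n Lambda(t*)), applied on the event.
    weights = np.where(s >= n * a, np.exp(-t_star * s + n * log_mgf), 0.0)
    est = weights.mean()
    rel_err = weights.std(ddof=1) / np.sqrt(runs) / est
    return est, rel_err

if __name__ == "__main__":
    print(twisted_tail_estimate(n=50, a=1.5))
```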

    Linear Stochastic Fluid Networks: Rare-Event Simulation and Markov Modulation

    We consider a linear stochastic fluid network under Markov modulation, with a focus on the probability that the joint storage level attains a value in a rare set at a given point in time. The main objective is to develop efficient importance sampling algorithms with provable performance guarantees. For linear stochastic fluid networks without modulation, we prove that the number of runs needed (so as to obtain an estimate with a given precision) increases polynomially (whereas the probability under consideration decays essentially exponentially); for networks operating in the slow modulation regime, our algorithm is asymptotically efficient. Our techniques are in the tradition of the rare-event simulation procedures that were developed for the sample mean of i.i.d. one-dimensional light-tailed random variables, and make intensive use of the idea of exponential twisting. In passing, we also point out how to set up a recursion to evaluate the (transient and stationary) moments of the joint storage level in Markov-modulated linear stochastic fluid networks.
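
    To make the target quantity concrete, here is a crude Monte Carlo sketch (not the paper's importance sampling algorithm) of the transient storage level in a single unmodulated node with compound-Poisson input and linear release; the one-node setup, exponential job sizes, and all parameter names are assumptions for illustration. For rare sets, such crude estimation needs exponentially many runs, which is what the proposed exponential-twisting scheme avoids.

```python
import numpy as np

def storage_exceedance(T, b, lam=1.0, mean_job=1.0, r=0.5,
                       runs=100_000, seed=1):
    """Crude Monte Carlo estimate of P(X(T) > b) for a single linear fluid
    node: content decays at rate r*X between Poisson(lam) input epochs and
    jumps by an Exp(mean_job) amount at each epoch."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(runs):
        x, t = 0.0, 0.0
        while True:
            dt = rng.exponential(1.0 / lam)        # time until the next input
            if t + dt > T:
                x *= np.exp(-r * (T - t))          # decay up to the horizon T
                break
            x = x * np.exp(-r * dt) + rng.exponential(mean_job)
            t += dt
        hits += x > b
    return hits / runs

if __name__ == "__main__":
    print(storage_exceedance(T=10.0, b=6.0))
```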

    Adaptive importance sampling technique for Markov chains using stochastic approximation

    For a discrete-time finite-state Markov chain, we develop an adaptive importance sampling scheme to estimate the expected total cost before hitting a set of terminal states. This scheme updates the change of measure at every transition using constant or decreasing step-size stochastic approximation. The updates are shown to concentrate asymptotically in a neighborhood of the desired zero-variance estimator. Through simulation experiments on simple Markovian queues, we observe that the proposed technique performs very well in estimating rare-event performance measures, such as the probability that queue lengths exceed prescribed thresholds. We include performance comparisons of the proposed algorithm with existing adaptive importance sampling algorithms on some examples. We also discuss the extension of the technique to estimate the infinite-horizon expected discounted cost and the expected average cost.
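
    The sketch below shows one possible instance of this kind of scheme, not the paper's exact algorithm: it estimates the probability that a simple birth-death queue reaches level K before emptying, keeps a table v of state values, rebuilds the change of measure from v in the zero-variance form q(x,y) proportional to p(x,y)v(y), and nudges v by stochastic approximation after every transition. The chain, step size, and initialization are all assumptions.

```python
import numpy as np

def adaptive_is(p=0.3, K=20, runs=2000, step=0.05, seed=2):
    """Adaptive IS estimate of P_1(hit K before 0) in a birth-death chain
    with up-probability p; v is the adapted value table."""
    rng = np.random.default_rng(seed)
    v = np.linspace(0.0, 1.0, K + 1)      # v[0]=0, v[K]=1, rough initial guess
    estimates = []
    for _ in range(runs):
        x, weight = 1, 1.0
        while 0 < x < K:
            probs = {x + 1: p, x - 1: 1.0 - p}                  # nominal kernel
            unnorm = {y: pr * v[y] for y, pr in probs.items()}  # zero-variance form
            z = sum(unnorm.values())
            y = x + 1 if rng.random() < unnorm[x + 1] / z else x - 1
            q = unnorm[y] / z
            weight *= probs[y] / q                              # likelihood ratio
            # Stochastic-approximation step: move v[x] toward an unbiased
            # estimate of sum_y p(x,y) v[y], reweighted because y was drawn from q.
            v[x] += step * (probs[y] / q * v[y] - v[x])
            x = y
        estimates.append(weight if x == K else 0.0)
    return float(np.mean(estimates))

if __name__ == "__main__":
    print(adaptive_is())   # gambler's-ruin value is roughly 6e-8 for these parameters
```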

    Variance Reduction Techniques in Monte Carlo Methods

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) remain necessary even though computer speed has been increasing dramatically ever since the introduction of computers. This increased computer power has stimulated simulation analysts to develop ever more realistic models, so the net result has not been faster execution of simulation experiments; e.g., some modern simulation models need hours or days for a single 'run' (one replication of one scenario, or combination of simulation input values). Moreover, some simulation models represent rare events (which have extremely small probabilities of occurrence), so even modern computers would take 'forever' (centuries) to execute a single run, were it not that special VRT can reduce these excessively long runtimes to practical magnitudes.
    Keywords: common random numbers; antithetic random numbers; importance sampling; control variates; conditioning; stratified sampling; splitting; quasi-Monte Carlo
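
    As a quick illustration of one technique from the keyword list (antithetic random numbers), the sketch below estimates E[exp(U)] for U ~ Uniform(0,1) and compares the variance of the crude estimator with the antithetic-pair estimator; the example is illustrative and not taken from the survey.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
u = rng.random(n)

crude = np.exp(u)                               # plain Monte Carlo samples
antithetic = 0.5 * (np.exp(u) + np.exp(1 - u))  # pair each U with its antithetic 1-U

print("crude:      mean %.5f  estimator variance %.2e" % (crude.mean(), crude.var(ddof=1) / n))
print("antithetic: mean %.5f  estimator variance %.2e" % (antithetic.mean(), antithetic.var(ddof=1) / n))
# The exact value is e - 1. Because exp is monotone, exp(U) and exp(1-U) are
# negatively correlated, so the antithetic estimator has much smaller variance.
```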

    Network Tomography: Identifiability and Fourier Domain Estimation

    The statistical problem in network tomography is to infer the distribution of X, a vector with mutually independent components, from the measurement model Y = AX, where A is a given binary matrix representing the routing topology of the network under consideration. The challenge is that the dimension of X is much larger than that of Y, so the problem is often called ill-posed. This paper studies some statistical aspects of network tomography. We first address the identifiability issue and prove that the distribution of X is identifiable up to a shift parameter under mild conditions. We then use a mixture model of characteristic functions to derive a fast algorithm for estimating the distribution of X based on the general method of moments. Through extensive model simulation and real Internet trace-driven simulation, the proposed approach is shown to compare favorably with previous methods that use simple discretization for inferring link delays in a heterogeneous network.
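
    The sketch below only sets up the measurement model for a toy two-receiver tree, so the dimensions are concrete: three independent link-delay components in X are observed only through two path sums Y = AX. The topology, exponential delay distributions, and all names are assumptions for illustration, not the paper's estimation algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)

# Links: 0 = shared root link, 1 and 2 = leaf links.
# Path to receiver 1 traverses links {0, 1}; path to receiver 2 traverses {0, 2}.
A = np.array([[1, 1, 0],
              [1, 0, 1]])

n = 10_000
X = np.column_stack([rng.exponential(1.0, n),   # independent per-link delays (unobserved)
                     rng.exponential(0.5, n),
                     rng.exponential(2.0, n)])
Y = X @ A.T                                     # observed end-to-end path delays

print(A.shape, X.shape, Y.shape)    # (2, 3) (10000, 3) (10000, 2): dim(X) > dim(Y)
```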