8,882 research outputs found

    On-the-fly Uniformization of Time-Inhomogeneous Infinite Markov Population Models

    Full text link
    This paper presents an on-the-fly uniformization technique for the analysis of time-inhomogeneous Markov population models. This technique is applicable to models with infinite state spaces and unbounded rates, which are encountered, for instance, in the realm of biochemical reaction networks. To deal with the infinite state space, we dynamically maintain a finite subset of the states where most of the probability mass is located. This approach yields an under-approximation of the original, infinite system. We present experimental results to show the applicability of our technique.
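
    As a rough illustration of the dynamic truncation idea described in the abstract (not the paper's uniformization scheme itself), the sketch below integrates the master equation of an infinite birth-death chain, a minimal production/degradation reaction network, while keeping only the states that currently carry non-negligible probability mass. The dropped mass makes the result an under-approximation, as in the abstract. All rates, thresholds, and step sizes are illustrative assumptions.

```python
import math
from collections import defaultdict

def transient_distribution(t_end, birth_rate, death_rate, dt=1e-3, eps=1e-10):
    """Approximate the transient distribution of an infinite birth-death
    chain (production X -> X+1, degradation X -> X-1) by explicit Euler
    steps on the master equation, keeping only states that currently carry
    more than `eps` probability mass.  The result is an under-approximation:
    truncated mass is simply lost."""
    p = defaultdict(float)
    p[0] = 1.0                       # start with zero molecules
    t = 0.0
    while t < t_end:
        dp = defaultdict(float)
        for n, pn in p.items():
            lam, mu = birth_rate(n, t), death_rate(n, t)
            dp[n]     -= (lam + mu) * pn * dt   # probability outflow
            dp[n + 1] += lam * pn * dt          # birth
            if n > 0:
                dp[n - 1] += mu * pn * dt       # death
        for n, d in dp.items():
            p[n] += d
        # on-the-fly truncation: drop states with negligible mass
        p = defaultdict(float, {n: pn for n, pn in p.items() if pn > eps})
        t += dt
    return dict(p)

# Example: constant production (rate 10), linear degradation (rate 0.1 * n);
# the rate functions also receive t, so time-inhomogeneous rates fit the
# same interface.
dist = transient_distribution(5.0,
                              birth_rate=lambda n, t: 10.0,
                              death_rate=lambda n, t: 0.1 * n)
print(sum(dist.values()))   # retained probability mass (<= 1)
```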

    Patterns of Scalable Bayesian Inference

    Full text link
    Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with few clear overarching principles. In this paper, we seek to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. We review existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, we characterize the general principles that have proven successful for designing scalable inference procedures and comment on the path forward.
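
    One recurring pattern in this line of work is replacing the full-data log-posterior gradient with an unbiased minibatch estimate, the shared ingredient of stochastic-gradient MCMC and stochastic variational inference. The sketch below is a generic example not taken from the paper: it uses such an estimate inside stochastic gradient Langevin dynamics on a toy Bayesian logistic regression. The model, data, batch size, and step size are all made-up illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: Bayesian logistic regression with a large number of observations.
N, D = 100_000, 5
X = rng.normal(size=(N, D))
true_w = rng.normal(size=D)
y = (rng.random(N) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)

def grad_log_posterior(w, batch_idx):
    """Unbiased minibatch estimate of the log-posterior gradient
    (standard-normal prior): the minibatch likelihood term is rescaled
    by N / batch size."""
    Xb, yb = X[batch_idx], y[batch_idx]
    preds = 1.0 / (1.0 + np.exp(-Xb @ w))
    grad_lik = Xb.T @ (yb - preds) * (N / len(batch_idx))
    grad_prior = -w
    return grad_lik + grad_prior

# Stochastic gradient Langevin dynamics: noisy gradient steps whose injected
# Gaussian noise is matched to the step size.
w = np.zeros(D)
step = 1e-5
for it in range(2000):
    idx = rng.choice(N, size=500, replace=False)
    w += 0.5 * step * grad_log_posterior(w, idx) \
         + rng.normal(scale=np.sqrt(step), size=D)
print("approximate posterior sample:", np.round(w, 2))
print("true weights:               ", np.round(true_w, 2))
```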

    A tool for model-checking Markov chains

    Get PDF
    Markov chains are widely used in the context of the performance and reliability modeling of various systems. Model checking of such chains with respect to a given (branching) temporal logic formula has been proposed for both discrete-time [34, 10] and continuous-time settings [7, 12]. In this paper, we describe a prototype model checker for discrete- and continuous-time Markov chains, the Erlangen-Twente Markov Chain Checker E⊢MC², where properties are expressed in appropriate extensions of CTL. We illustrate the general benefits of this approach and discuss the structure of the tool. Furthermore, we report on successful applications of the tool to some examples, highlighting lessons learned during the development and application of E⊢MC².
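
    For readers unfamiliar with this style of model checking, the sketch below shows the core numerical step for one common class of properties: the probability of an "until" formula, P>=p [a U b], on a small discrete-time Markov chain, computed by value iteration. This is a generic textbook computation, not the implementation used in E⊢MC², whose logics and engine are described in the paper; the chain, labels, and threshold are made up.

```python
import numpy as np

# A tiny discrete-time Markov chain, given by its transition matrix.
# We check a PCTL-style property  P>=0.9 [ a U b ]  by computing, for every
# state, the probability of reaching a b-state along a-states only.
P = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.0, 0.2, 0.0, 0.8],
    [0.0, 0.0, 1.0, 0.0],   # absorbing "failure" state
    [0.0, 0.0, 0.0, 1.0],   # absorbing "goal" state
])
a = np.array([True, True, False, False])   # states labelled a
b = np.array([False, False, False, True])  # states labelled b

def prob_until(P, a, b, iters=1000, tol=1e-12):
    """Value iteration for Prob(a U b): x = 1 on b-states, x = 0 on states
    satisfying neither a nor b, and x = P x elsewhere, iterated to a fixed
    point."""
    x = b.astype(float)
    maybe = a & ~b
    for _ in range(iters):
        x_new = x.copy()
        x_new[maybe] = P[maybe] @ x
        if np.max(np.abs(x_new - x)) < tol:
            break
        x = x_new
    return x

probs = prob_until(P, a, b)
print(probs)         # per-state probabilities of a U b
print(probs >= 0.9)  # which states satisfy P>=0.9 [ a U b ]
```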

    Computing Battery Lifetime Distributions

    Get PDF
    The usage of mobile devices such as cell phones, navigation systems, or laptop computers is limited by the lifetime of their batteries. This lifetime depends not only on the rate at which energy is consumed but also on the usage pattern of the battery. Continuous drawing of a high current results in an excessive drop of residual capacity, whereas during intervals with no or very small currents, batteries recover to a certain extent. We model this complex behaviour with an inhomogeneous Markov reward model, following the approach of the so-called Kinetic Battery Model (KiBaM). The state-dependent reward rates thereby correspond to the power consumption of the attached device and to the available charge, respectively. We develop a tailored numerical algorithm for the computation of the distribution of the consumed energy and show how different workload patterns influence the overall lifetime of a battery.
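
    The paper computes full lifetime distributions via an inhomogeneous Markov reward model; as a much simpler companion, the sketch below only integrates the deterministic KiBaM equations for a fixed workload, with made-up parameters, to show the recovery effect the abstract mentions: interrupting a load lets bound charge flow back into the available well, so an intermittent load delivers more total charge than the same current drawn continuously.

```python
def kibam(load, capacity=2_000_000.0, c=0.625, k=4.5e-5, dt=1.0, t_max=1e6):
    """Forward-Euler integration of the Kinetic Battery Model (KiBaM).
    The total charge is split into an available well y1 (fraction c) and a
    bound well y2; charge flows from the bound to the available well at a
    rate proportional to the difference of their 'heights'.  The battery
    is empty when the available well is drained.  `load(t)` is the current
    drawn at time t; all parameter values here are illustrative only.
    Returns (lifetime in s, total charge delivered in mA*s)."""
    y1, y2 = c * capacity, (1.0 - c) * capacity
    t, delivered = 0.0, 0.0
    while y1 > 0.0 and t < t_max:
        i = load(t)
        flow = k * (y2 / (1.0 - c) - y1 / c)   # bound -> available
        y1 += dt * (flow - i)
        y2 -= dt * flow
        delivered += dt * i
        t += dt
    return t, delivered

# Continuous 500 mA load versus the same current drawn in 60 s on/off bursts:
# the idle periods let the available well recover, so the intermittent load
# delivers more total charge before the battery is empty.
t_cont, q_cont = kibam(lambda t: 500.0)
t_int,  q_int  = kibam(lambda t: 500.0 if (t // 60) % 2 == 0 else 0.0)
print(f"continuous:   empty after {t_cont:.0f} s, delivered {q_cont:.0f} mA*s")
print(f"intermittent: empty after {t_int:.0f} s, delivered {q_int:.0f} mA*s")
```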