
    Approximate performability and dependability analysis using generalized stochastic Petri Nets

    Since current-day fault-tolerant and distributed computer and communication systems tend to be large and complex, their corresponding performability models suffer from the same characteristics. Calculating performability measures from these models is therefore a difficult and time-consuming task.

    To alleviate the largeness and complexity problem to some extent, we use generalized stochastic Petri nets to describe the models and to automatically generate the underlying Markov reward models. However, many models still cannot be solved with current numerical techniques, even though they are conveniently and often compactly described.

    In this paper we discuss two heuristic state-space truncation techniques that allow us to obtain very good approximations of the steady-state performability while assessing only a few percent of the states of the untruncated model. For a class of reversible models we derive explicit lower and upper bounds on the exact steady-state performability. For a much wider class of models, a truncation theorem exists that allows one to bound the error made in the truncation. We discuss this theorem in the context of approximate performability models and comment on its applicability. For all the proposed truncation techniques we present examples showing their usefulness.

    From the arrow of time in Badiali's quantum approach to the dynamic meaning of Riemann's hypothesis

    The novelty of Jean Pierre Badiali's last scientific works stems from a quantum approach based on both (i) a return to the notion of trajectories (Feynman paths) and (ii) an irreversibility of the quantum transitions. These iconoclastic choices recover the Hilbertian and the von Neumann algebraic points of view by dealing with statistics over loops. This approach confers an external thermodynamic origin on the notion of a quantum unit of time (Rovelli and Connes' thermal time). This notion, the basis for quantization, appears herein as a mere criterion separating the quantum regime from the thermodynamic regime. The purpose of this note is to unfold the content of the last five years of scientific exchanges aiming to link, in a coherent scheme, Jean Pierre's choices and works with the works of the authors of this note, based on hyperbolic geodesics and the associated role of Riemann zeta functions. While these options unveil no contradictions, they nevertheless give birth to an intrinsic arrow of time different from the thermal time. The question of the physical meaning of the Riemann hypothesis as the basis of quantum mechanics, which was at the heart of our last exchanges, is the backbone of this note.
    Comment: 13 pages, 2 figures

    On the utility of Metropolis-Hastings with asymmetric acceptance ratio

    The Metropolis-Hastings (MH) algorithm allows one to sample asymptotically from any probability distribution π. There has recently been much work devoted to the development of variants of the MH update which can handle scenarios where pointwise evaluation of π is impossible, and yet are guaranteed to sample from π asymptotically. The most popular approach to have emerged is arguably the pseudo-marginal (PM) MH algorithm, which substitutes an unbiased estimate of an unnormalised version of π for π itself. Alternative pseudo-marginal algorithms relying instead on unbiased estimates of the MH acceptance ratio have also been proposed; these can have better properties than standard PM algorithms. Convergence properties of both classes of algorithms are known to depend on the variability of the estimators involved: reduced variability is guaranteed to decrease the asymptotic variance of ergodic averages and will shorten the burn-in period, or convergence to equilibrium, in most scenarios of interest. A simple approach to reducing variability, amenable to parallel computation, consists of averaging independent estimators. However, while averaging estimators of π in a pseudo-marginal algorithm retains the guarantee of sampling from π asymptotically, naive averaging of acceptance-ratio estimates breaks detailed balance, leading to incorrect results. We propose an original methodology which allows a correct implementation of this idea. We establish theoretical properties which parallel those available for standard PM algorithms and discussed above. We demonstrate the interest of the approach on various inference problems; in particular, we show that convergence to equilibrium can be significantly shortened, offering the possibility to reduce a user's waiting time in a generic fashion when a parallel computing architecture is available.
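    The standard pseudo-marginal baseline that such work builds on can be sketched in a few lines (a toy sketch, not the authors' asymmetric-acceptance-ratio method: the target, the mean-one lognormal noise model, and all parameters are illustrative assumptions). The key mechanics are that the chain carries the current noisy estimate of π along with the state and reuses it in the acceptance ratio, and that averaging n_avg independent copies of the estimator reduces its variance while preserving unbiasedness, hence exactness:

```python
import numpy as np

rng = np.random.default_rng(0)

def pi_hat(x, n_avg):
    """Unbiased noisy estimate of an unnormalised N(0,1) density at x,
    averaged over n_avg independent copies (multiplicative noise, mean 1)."""
    noise = rng.lognormal(mean=-0.125, sigma=0.5, size=n_avg)  # E[noise] = 1
    return np.exp(-0.5 * x**2) * noise.mean()

def pseudo_marginal_mh(n_iter, n_avg, step=1.0):
    x, z = 0.0, pi_hat(0.0, n_avg)   # carry the current estimate with the state
    out = np.empty(n_iter)
    for i in range(n_iter):
        xp = x + step * rng.normal()  # random-walk proposal
        zp = pi_hat(xp, n_avg)
        if rng.random() < zp / z:     # MH ratio on the *estimated* target
            x, z = xp, zp
        out[i] = x
    return out

samples = pseudo_marginal_mh(50_000, n_avg=10)
print(samples.mean(), samples.var())  # should be close to 0 and 1 for N(0,1)
```

Averaging inside `pi_hat` is the legitimate form of averaging mentioned in the abstract; averaging independent estimates of the acceptance ratio itself, by contrast, would break detailed balance.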

    A nonequilibrium thermodynamics perspective on nature-inspired chemical engineering processes

    Nature-inspired chemical engineering (NICE) promises many benefits in terms of energy consumption, resilience, efficiency, etc., but it struggles to emerge as a leading discipline, chiefly because of the misconception that mimicking Nature is sufficient. It is not, since the goals and constrained context are different. Hence, revealing context and understanding the mechanisms of nature-inspiration should be encouraged. In this contribution we revisit the classification of three published mechanisms underlying nature-inspired engineering, namely hierarchical transport networks, force balancing, and dynamic self-organization, by setting them in a broader framework supported by nonequilibrium thermodynamics (NET), the constructal law (CL), and nonlinear control concepts. While the mapping of the three mechanisms is not complete, the joint NET and CL framework also opens new perspectives. This novel perspective goes beyond classical chemical engineering, where equilibrium-based assumptions or linear transport phenomena and control are the ruling mechanisms in process-unit design and operation. At the small-scale level, NICE processes should sometimes consider advanced thermodynamic concepts to account for fluctuations and boundary effects on local properties. At the process-unit level, one should exploit out-of-equilibrium situations with thermodynamic coupling under various dynamical states, be it a stationary state or a self-organized state. Then, nonlinear phenomena, possibly provoked by operating with larger driving forces to achieve greater dissipative flows, might occur, controllable using nonlinear control theory. At the plant level, the virtual-factory approach, relying on servitization and modular equipment, proposes a framework for knowledge and information management that could lead to resilient and agile chemical plants, especially biorefineries.

    Bayesian computational methods

    In this chapter, we will first present the most standard computational challenges met in Bayesian Statistics, focussing primarily on mixture estimation and on model choice issues, and then relate these problems to computational solutions. Of course, this chapter is only a terse introduction to the problems and solutions related to Bayesian computations. For more complete references, see Robert and Casella (2004, 2009) or Marin and Robert (2007), among others. We also refrain from providing an introduction to Bayesian Statistics per se and, for comprehensive coverage, refer the reader to Robert (2007), (again) among others.
    Comment: This is a revised version of a chapter written for the Handbook of Computational Statistics, edited by J. Gentle, W. Hardle and Y. Mori in 2003, in preparation for the second edition
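    Mixture estimation, the first challenge the chapter names, is commonly attacked with a Gibbs sampler that alternates between sampling latent component allocations and sampling component parameters. The following is a minimal sketch, not taken from the chapter: it assumes a two-component Gaussian mixture with known unit variances, equal weights, and N(0, prior_var) priors on the two means, all of which are illustrative simplifications:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data from an equal-weight two-component Gaussian mixture.
data = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])

def gibbs_mixture(data, n_iter=2000, prior_var=100.0):
    """Gibbs sampler for the means of a two-component N(mu_k, 1) mixture
    with known unit variances, equal weights, and N(0, prior_var) priors."""
    mu = np.array([-1.0, 1.0])
    trace = np.empty((n_iter, 2))
    for t in range(n_iter):
        # 1. Sample latent allocations given the current means.
        logp = -0.5 * (data[:, None] - mu[None, :]) ** 2
        p2 = 1.0 / (1.0 + np.exp(logp[:, 0] - logp[:, 1]))  # P(component 2 | x)
        z = (rng.random(len(data)) < p2).astype(int)
        # 2. Sample each mean from its conjugate normal full conditional.
        for k in (0, 1):
            xk = data[z == k]
            post_var = 1.0 / (len(xk) + 1.0 / prior_var)
            mu[k] = rng.normal(post_var * xk.sum(), np.sqrt(post_var))
        trace[t] = np.sort(mu)  # crude guard against label switching
    return trace

trace = gibbs_mixture(data)
print(trace[500:].mean(axis=0))  # posterior means should sit near (-2, 2)
```

Sorting the means at each iteration is only a crude identifiability fix; handling label switching properly is one of the computational difficulties such chapters discuss.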

    Discussion of "Geodesic Monte Carlo on Embedded Manifolds"

    Contributed discussion and rejoinder to "Geodesic Monte Carlo on Embedded Manifolds" (arXiv:1301.6064).
    Comment: Discussion of arXiv:1301.6064. To appear in the Scandinavian Journal of Statistics. 18 pages

    Computational statistics using the Bayesian Inference Engine

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimised software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organise and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree-based kernel density estimation. Among its algorithmic offerings, the BIE emphasises hybrid tempered MCMC schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence (serialisation) system that stores the full byte-level image of the running inference and previously characterised posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download instructions are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU GPL.
    Comment: Resubmitted version
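    The tempered MCMC idea the abstract highlights can be illustrated with a minimal parallel-tempering sketch (not the BIE's actual hybrid schemes: the bimodal toy posterior, temperature ladder, and step size are all assumptions). Chains at flattened powers of the posterior cross between modes easily, and neighbour swaps feed those crossings back to the untempered chain:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(x):
    """A bimodal toy 'posterior': equal mixture of N(-3, 0.5^2) and N(3, 0.5^2)."""
    return np.logaddexp(-0.5 * ((x + 3) / 0.5) ** 2, -0.5 * ((x - 3) / 0.5) ** 2)

def parallel_tempering(n_iter=20_000, betas=(1.0, 0.5, 0.2, 0.05), step=1.0):
    """Metropolis within parallel tempering: one chain per temperature;
    only the beta = 1 chain samples the actual target."""
    x = np.zeros(len(betas))
    out = np.empty(n_iter)
    for t in range(n_iter):
        # Local Metropolis move on every tempered chain.
        for j, b in enumerate(betas):
            xp = x[j] + step * rng.normal()
            if np.log(rng.random()) < b * (log_post(xp) - log_post(x[j])):
                x[j] = xp
        # Propose swapping the states of a random neighbouring pair.
        j = rng.integers(len(betas) - 1)
        d = (betas[j] - betas[j + 1]) * (log_post(x[j + 1]) - log_post(x[j]))
        if np.log(rng.random()) < d:
            x[j], x[j + 1] = x[j + 1], x[j]
        out[t] = x[0]
    return out

s = parallel_tempering()
print((s < 0).mean())  # both modes visited: fraction should be near 0.5
```

A plain random-walk Metropolis chain with the same step size would remain stuck in one of the two modes; the swap moves are what restore mobility for the cold chain.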