38 research outputs found

    Studying the effects of adding spatiality to a process algebra model

    We use NetLogo to create simulations of two models of disease transmission originally expressed in WSCCS. This allows us to introduce spatiality into the models and explore the consequences of having different contact structures among the agents. In previous work, mean field equations were derived from the WSCCS models, giving a description of the aggregate behaviour of the overall population of agents. These results turned out to differ from results obtained by another team using cellular automata models, which, unlike process algebra, are inherently spatial. By using NetLogo we are able to explore whether spatiality, and the resulting differences in the contact structures of the two kinds of models, are the reason for these differing results. Our tentative conclusions, based at this point on informal observations of simulation results, are that space does indeed make a big difference. If space is ignored and individuals are allowed to mix randomly, then the simulations yield results that closely match the mean field equations, and consequently also match the associated global transmission terms (explained below). At the opposite extreme, if individuals can only contact their immediate neighbours, the simulation results are very different from the mean field equations (and also do not match the global transmission terms). These results are not surprising, and are consistent with other cellular automata-based approaches. We found it easy and convenient to implement and simulate the WSCCS models within NetLogo, and we recommend this approach to anyone wishing to explore the effects of introducing spatiality into a process algebra model.
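    To make the contrast between global and neighbour-only transmission concrete, the Python sketch below (an illustration under assumed parameters, not the authors' WSCCS or NetLogo code) runs an SIS-style agent simulation on a grid with either random mixing or nearest-neighbour contact, and compares both against a discrete-time mean-field recursion; BETA, GAMMA, the grid size, and the 4-contact assumption are illustrative choices.

```python
# Hypothetical sketch, not the authors' model: SIS-style agents on a grid,
# contrasting random mixing with nearest-neighbour contact, plus the
# corresponding mean-field recursion with global (mass-action) transmission.
import random

N_SIDE = 30                      # agents live on an N_SIDE x N_SIDE grid (assumed)
N = N_SIDE * N_SIDE
BETA, GAMMA = 0.3, 0.1           # per-contact infection / recovery probabilities (assumed)
CONTACTS, STEPS = 4, 200

def neighbours(i):
    """4-neighbourhood (von Neumann) on the grid, with wrap-around."""
    r, c = divmod(i, N_SIDE)
    return [((r - 1) % N_SIDE) * N_SIDE + c, ((r + 1) % N_SIDE) * N_SIDE + c,
            r * N_SIDE + (c - 1) % N_SIDE, r * N_SIDE + (c + 1) % N_SIDE]

def simulate(spatial):
    """Return the fraction infected over time for one stochastic run."""
    infected = [False] * N
    for i in random.sample(range(N), N // 100):   # seed roughly 1% infected
        infected[i] = True
    trace = []
    for _ in range(STEPS):
        new = infected[:]
        for i in range(N):
            if infected[i]:
                if random.random() < GAMMA:       # recovery back to susceptible
                    new[i] = False
            else:
                contacts = neighbours(i) if spatial else random.sample(range(N), CONTACTS)
                k = sum(infected[j] for j in contacts)
                if random.random() < 1 - (1 - BETA) ** k:
                    new[i] = True
        infected = new
        trace.append(sum(infected) / N)
    return trace

def mean_field():
    """Discrete-time mean-field recursion: every contact is drawn from the whole population."""
    i, trace = 0.01, []
    for _ in range(STEPS):
        i = i + (1 - i) * (1 - (1 - BETA * i) ** CONTACTS) - GAMMA * i
        trace.append(i)
    return trace

if __name__ == "__main__":
    for label, run in [("random mixing", simulate(False)),
                       ("nearest-neighbour", simulate(True)),
                       ("mean field", mean_field())]:
        print(f"{label:18s} final infected fraction: {run[-1]:.3f}")
```

    Under these assumptions the random-mixing run tends to track the mean-field recursion, while the neighbour-only run typically settles at a noticeably different level, which is the qualitative effect of spatiality described above.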

    Scalable context-dependent analysis of emergency egress models

    Pervasive environments offer an increasing number of services to a large number of people moving within these environments, including timely information about where to go and when, and contextual information about the surrounding environment. This information may be conveyed to people through public displays or directly to a person's mobile phone. People using these services interact with the system, but they are also meeting other people and performing other activities as relevant opportunities arise. The design of such systems and the analysis of the collective dynamic behaviour of people within them is a challenging problem. We present results on a novel usage of a scalable analysis technique in this context. We show the validity of an approach based on stochastic process-algebraic models by focussing on a representative example, i.e. emergency egress. The chosen case study has the advantage that detailed data is available from studies employing alternative analysis methods, making cross-methodology comparison possible. We also illustrate how realistic, context-dependent human behaviour, often observed in emergency egress, can naturally be embedded in the models, and how the effect of such behaviour on evacuation can be analysed in an efficient and scalable way. The proposed approach encompasses both the agent modelling viewpoint, as system behaviour emerges from specific (discrete) agent interactions, and the population viewpoint, where classes of homogeneous individuals are considered for a (continuous) approximation of overall system behaviour.
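    As a rough illustration of the two viewpoints mentioned above, the Python sketch below (hypothetical, not the paper's process-algebra model) simulates egress through a capacity-limited exit at the level of individual agents and compares it with a fluid, population-level approximation of the same system; the occupant count N0, exit rate MU and exit capacity C are assumed parameters.

```python
# Hypothetical egress illustration: individual (discrete) agents leaving
# through an exit at rate MU per occupied exit slot, versus the fluid
# (continuous) approximation of the same population dynamics.
import random

N0 = 100          # initial occupants (assumed)
MU = 1.5          # exit service rate per slot, people per second (assumed)
C = 2             # exit capacity: at most C people pass concurrently (assumed)
T_END = 300.0     # simulation horizon in seconds

def agent_view(seed=0):
    """Gillespie-style simulation: departures occur at total rate MU * min(n, C)."""
    rng = random.Random(seed)
    t, n = 0.0, N0
    while n > 0 and t < T_END:
        t += rng.expovariate(MU * min(n, C))   # time to the next departure
        n -= 1
    return t                                   # evacuation time for this run

def population_view(dt=0.01):
    """Fluid (mean-field) approximation: dn/dt = -MU * min(n, C)."""
    t, n = 0.0, float(N0)
    while n > 0.5 and t < T_END:
        n -= MU * min(n, C) * dt
        t += dt
    return t

if __name__ == "__main__":
    runs = [agent_view(seed) for seed in range(20)]
    print(f"agent view : mean evacuation time {sum(runs) / len(runs):6.1f} s over {len(runs)} runs")
    print(f"fluid view : evacuation time      {population_view():6.1f} s")
```

    The agent view captures run-to-run variability, while the fluid view gives a single deterministic trajectory; the scalable analysis discussed in the abstract exploits the latter when populations are large and homogeneous.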

    Abstract Interpretation of PEPA Models


    Evaluating the Robustness of Resource Allocations Obtained through Performance Modeling with Stochastic Process Algebra

    Recent developments in the field of parallel and distributed computing have led to a proliferation of efforts to solve large, computationally intensive mathematical, scientific, or engineering problems that consist of several parallelizable parts and several non-parallelizable (sequential) parts. In a parallel and distributed computing environment, the performance goal is to optimize the execution of the parallelizable parts of an application on concurrent processors. This requires efficient application scheduling and resource allocation for mapping applications to a set of suitable parallel processors such that the overall performance goal is achieved. However, such computational environments are often prone to unpredictable variations in application (problem and algorithm) and system characteristics. Therefore, a robustness study is required to guarantee a desired level of performance. Given an initial workload, a mapping of applications to resources is considered to be robust if that mapping optimizes execution performance and guarantees a desired level of performance in the presence of unpredictable perturbations at runtime. In this research, a stochastic process algebra, Performance Evaluation Process Algebra (PEPA), is used to obtain resource allocations via numerical analysis of performance models of the parallel execution of applications on parallel computing resources. The PEPA performance model is translated into an underlying Markov chain model from which performance measures are obtained. Further, a robustness analysis of the allocation techniques is performed to find a robust mapping from a set of initial mapping schemes. The numerical analysis of the performance models has confirmed similarity with the simulation results of earlier research available in the existing literature. When compared to direct experiments and simulations, numerical models and the corresponding analyses are easier to reproduce, do not incur any setup or installation costs, do not require learning a simulation framework, and are not limited by the complexity of the underlying infrastructure or simulation libraries.
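    The sketch below (a hypothetical stand-in, not the paper's PEPA specification) shows the kind of continuous-time Markov chain that a small closed model of tasks sharing processors induces, solved numerically for its steady-state distribution, throughput and utilisation; the values of JOBS, PROCS, LAMBDA and MU are assumptions chosen for illustration.

```python
# Hypothetical sketch of the "process-algebra model -> Markov chain -> numerical
# performance measures" pipeline: a closed system of JOBS tasks alternating
# between local work and service at one of PROCS processors.
import numpy as np

JOBS = 4       # tasks circulating in the closed system (assumed)
PROCS = 2      # parallel processors (assumed)
LAMBDA = 1.0   # rate at which a working task submits a parallel job (assumed)
MU = 2.0       # per-processor service rate (assumed)

# State = number of tasks queued or in service at the processors (0..JOBS).
states = range(JOBS + 1)
Q = np.zeros((JOBS + 1, JOBS + 1))
for n in states:
    if n < JOBS:                      # submission by one of the JOBS - n working tasks
        Q[n, n + 1] = (JOBS - n) * LAMBDA
    if n > 0:                         # completion by one of the min(n, PROCS) busy processors
        Q[n, n - 1] = min(n, PROCS) * MU
    Q[n, n] = -Q[n].sum()             # generator rows must sum to zero

# Steady state: solve pi Q = 0 together with sum(pi) = 1 as an augmented system.
A = np.vstack([Q.T, np.ones(JOBS + 1)])
b = np.zeros(JOBS + 2)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

throughput = sum(pi[n] * min(n, PROCS) * MU for n in states)
utilisation = sum(pi[n] * min(n, PROCS) for n in states) / PROCS
print("steady-state distribution:", np.round(pi, 4))
print(f"throughput  : {throughput:.3f} tasks/s")
print(f"utilisation : {utilisation:.3f}")
```

    Measures such as throughput and processor utilisation computed from the steady-state distribution are the kind of quantities that can then feed a comparison of candidate mappings and a robustness analysis across perturbed workloads.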