
    Process algebra for performance evaluation

    This paper surveys the theoretical developments in the field of stochastic process algebras, process algebras where action occurrences may be subject to a delay that is determined by a random variable. A huge class of resource-sharing systems – such as large-scale computers, client–server architectures and networks – can accurately be described using such stochastic specification formalisms. The main emphasis of this paper is the treatment of operational semantics, notions of equivalence, and (sound and complete) axiomatisations of these equivalences for different types of Markovian process algebras, where delays are governed by exponential distributions. Starting from a simple actionless algebra for describing time-homogeneous continuous-time Markov chains, we consider the integration of actions and random delays both as a single entity (as in the well-known Markovian process algebras TIPP, PEPA and EMPA) and as separate entities (as in the timed process algebras timed CSP and TCCS). In total we consider four related calculi and investigate their relationship to existing Markovian process algebras. We also briefly indicate how one can profit from the separation of time and actions when incorporating more general, non-Markovian distributions
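    The core semantic device of such Markovian calculi is the race between exponentially distributed delays: a state is left at the sum of the rates of its enabled activities, and a branch wins with probability proportional to its rate. A minimal Python sketch of that race (illustrative only, not taken from the paper; action names and rates are made up):

    import random

    def race(activities):
        """activities: dict mapping action name -> rate (1/mean delay).
        Samples an exponential delay per activity; the fastest sample wins,
        which is the race condition underlying Markovian semantics."""
        samples = {a: random.expovariate(rate) for a, rate in activities.items()}
        winner = min(samples, key=samples.get)
        return winner, samples[winner]

    # Example: a server that either completes a job (rate 3.0) or fails (rate 0.1).
    action, delay = race({"serve": 3.0, "fail": 0.1})
    print(f"after {delay:.3f} time units the process performs '{action}'")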

    KKF-Model Platform Coupling : summary report KKF01b

    The Netherlands is preparing for a faster-rising sea level and a changing climate. The Deltaprogramma (Delta Programme) has been started for this purpose. This programme foresees a series of decisions that will have major consequences for water management in the Netherlands. Taking these decisions carefully requires information on how the climate and the rising sea level will affect this water management. The models that calculate the consequences of climate change must therefore be driven by the same climate forcing and be used coupled to one another. This study examined the linking of hydrological and hydrodynamic models – and of the coupled models that simulate developments in nature and land use – that describe the area from the Alps down to the North Sea, including the Netherlands

    Analysis of a Multimedia Stream using Stochastic Process Algebra

    It is now well recognised that the next generation of distributed systems will be distributed multimedia systems. Central to multimedia systems is quality of service, which defines the non-functional requirements on the system. In this paper we investigate how stochastic process algebra can be used in order to determine the quality of service properties of distributed multimedia systems. We use a simple multimedia stream as our basic example. We describe it in the Stochastic Process Algebra PEPA and then we analyse whether the stream satisfies a set of quality of service parameters: throughput, end-to-end latency, jitter and error rates
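    A hypothetical sketch (not the paper's PEPA model; all rates and the loss probability are invented) of how such quality of service figures can be estimated by simulating a lossy stream:

    import random
    import statistics

    SEND_RATE, NET_RATE, LOSS_PROB, FRAMES = 25.0, 40.0, 0.02, 10_000

    clock, latencies, lost = 0.0, [], 0
    for _ in range(FRAMES):
        clock += random.expovariate(SEND_RATE)          # exponential inter-send time
        if random.random() < LOSS_PROB:
            lost += 1                                    # frame never arrives
            continue
        latencies.append(random.expovariate(NET_RATE))   # end-to-end network delay

    throughput = len(latencies) / clock                  # delivered frames per time unit
    latency = statistics.mean(latencies)                 # mean end-to-end latency
    jitter = statistics.pstdev(latencies)                # spread of the delays
    error_rate = lost / FRAMES                           # fraction of frames lost
    print(throughput, latency, jitter, error_rate)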

    Distributed systems : architecture-driven specification using extended LOTOS

    The thesis uses the LOTOS language (ISO International Standard ISO 8807) as a basis for the formal specification of distributed systems. Contributions are made to two key research areas: architecture-driven specification and LOTOS language extensions. The notion of architecture-driven specification is to guide the specification process by providing a reference-base of pre-defined domain-specific components. The thesis builds an infrastructure of architectural elements, and provides Extended LOTOS (XL) definitions of these elements. The thesis develops Extended LOTOS (XL) for the specification of distributed systems. XL is LOTOS enhanced with features for the formal specification of quantitative timing, probabilistic and priority requirements. For distributed systems, the specification of these 'performance' requirements can be as important as the specification of the associated functional requirements. To support quantitative timing features, the XL semantics define a global, discrete clock which can be used both to force events to occur at specific times, and to measure intervals between event occurrences. XL introduces the time policy operators ASAP ('as soon as possible', corresponding to "maximal progress semantics") and ALAP ('as late as possible'). Special internal transitions are introduced in the XL semantics for the specification of probability. Conformance relations based on a notion of probabilization, together with a testing framework, are defined to support reasoning about probabilistic XL specifications. Priority within the XL semantics ensures that permitted events with the highest priority weighting of their class are allowed first. Both functional and performance specification play important roles in CIM (Computer Integrated Manufacturing) systems. The thesis uses a CIM system known as the CIM-OSA Integrating Infrastructure as a case study of architecture-driven specification using XL. The thesis thus constitutes a step in the evolution of distributed system specification methods that have both an architectural basis and a formal basis
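    A toy illustration (not the thesis's XL semantics; event names and windows are invented) of the difference between the two time policy operators on a discrete global clock:

    def schedule(events, policy="ASAP"):
        """events: dict name -> (earliest_tick, latest_tick) on a discrete clock.
        ASAP fires each event at the earliest allowed tick (maximal progress);
        ALAP postpones it to the latest allowed tick."""
        pick = (lambda lo, hi: lo) if policy == "ASAP" else (lambda lo, hi: hi)
        return {name: pick(lo, hi) for name, (lo, hi) in events.items()}

    windows = {"request": (0, 2), "reply": (3, 7)}
    print(schedule(windows, "ASAP"))   # {'request': 0, 'reply': 3}
    print(schedule(windows, "ALAP"))   # {'request': 2, 'reply': 7}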

    Generating a Performance Stochastic Model from UML Specifications

    Since its initiation by Connie Smith, the process of Software Performance Engineering (SPE) has become a growing concern. The idea is to bring performance evaluation into the software design process. This methodology allows software designers to determine the performance of software during design. Several approaches have been proposed to provide such techniques. Some of them propose to derive from a UML (Unified Modeling Language) model a performance model such as a Stochastic Petri Net (SPN) or Stochastic Process Algebra (SPA) model. Our work belongs to the same category. We propose to derive from a UML model a Stochastic Automata Network (SAN) in order to obtain performance predictions. Our approach is more flexible due to the SAN's modularity and its close resemblance to UML state-chart diagrams
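    A minimal sketch of the SAN modularity referred to above (illustrative only, not the paper's UML translation; the rates are arbitrary): two independent local automata combine into a global generator via a Kronecker sum, while synchronising events would contribute additional Kronecker product terms.

    import numpy as np

    # Two independent two-state automata, each described by a small generator
    # matrix (rows sum to zero); rates are made-up illustration values.
    Q1 = np.array([[-2.0,  2.0],
                   [ 1.0, -1.0]])
    Q2 = np.array([[-0.5,  0.5],
                   [ 3.0, -3.0]])

    # Global generator of the combined 4-state model: the Kronecker sum
    # Q = Q1 (x) I + I (x) Q2.
    Q = np.kron(Q1, np.eye(2)) + np.kron(np.eye(2), Q2)
    print(Q)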

    On the Expressiveness of Markovian Process Calculi with Durational and Durationless Actions

    Several Markovian process calculi have been proposed in the literature, which differ from each other in various respects. With regard to the representation of actions, we distinguish between integrated-time Markovian process calculi, in which every action has an exponentially distributed duration associated with it, and orthogonal-time Markovian process calculi, in which action execution is separated from time passing. As in the case of deterministically timed process calculi, we show that these two options are not irreconcilable, by exhibiting three mappings from an integrated-time Markovian process calculus to an orthogonal-time Markovian process calculus that preserve the behavioral equivalence of process terms under different interpretations of action execution: eagerness, laziness, and maximal progress. The mappings are limited to classes of process terms of the integrated-time Markovian process calculus with restrictions on parallel composition and do not involve the full capability of the orthogonal-time Markovian process calculus of expressing nondeterministic choices, thus elucidating the only two important differences between the two calculi: their synchronization disciplines and their ways of solving choices
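    A schematic Python sketch of the representational difference only (not one of the paper's three behaviour-preserving mappings): an integrated-time prefix carries action and rate together, while the orthogonal-time reading splits it into an exponential delay followed by a durationless action.

    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Stop: ...
    @dataclass
    class IntegratedPrefix:   # <action, rate>.cont  (integrated time)
        action: str
        rate: float
        cont: "Term"
    @dataclass
    class Delay:              # (rate).cont          (time passing only)
        rate: float
        cont: "Term"
    @dataclass
    class Action:             # action.cont          (durationless action)
        action: str
        cont: "Term"

    Term = Union[Stop, IntegratedPrefix, Delay, Action]

    def split(term: Term) -> Term:
        """Rewrite every integrated prefix as a delay followed by an action."""
        if isinstance(term, IntegratedPrefix):
            return Delay(term.rate, Action(term.action, split(term.cont)))
        if isinstance(term, Delay):
            return Delay(term.rate, split(term.cont))
        if isinstance(term, Action):
            return Action(term.action, split(term.cont))
        return term

    print(split(IntegratedPrefix("send", 2.0, Stop())))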

    Formalization and Model Checking of BPMN Collaboration Diagrams with DD-LOTOS

    Business Process Model and Notation (BPMN) is a standard graphical notation for modeling complex business processes. Given the importance of business processes, the modeling, analysis and validation stage for BPMN is essential. In recent years, BPMN has become widespread practice in business process modeling because of its intuitive diagrams. BPMN diagrams are built from basic elements. The major challenge of BPMN diagrams is the lack of formal semantics, which leads to several interpretations of the diagrams concerned. Hence, this work aims to propose an approach for checking BPMN collaboration diagrams in order to guarantee some properties required for the smooth functioning of systems modeled with the BPMN notation. The verification approach used in this work is based on model checking techniques. As a first step, the approach proposes a formal semantics for collaboration diagrams in terms of the formal language DD-LOTOS, i.e., a transformation of collaboration diagrams into DD-LOTOS. This transformation is guided by the inference rules of the formal semantics of DD-LOTOS, and we then use the UPPAAL model checker to check the absence of deadlock, safety properties, and liveness properties
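    On an explicit state-transition graph, the absence-of-deadlock property checked above amounts to every reachable state having at least one outgoing transition. A small illustrative Python sketch (the graph is invented and does not stand for the DD-LOTOS/UPPAAL tool chain):

    GRAPH = {
        "start":   ["send"],
        "send":    ["receive", "timeout"],
        "receive": ["start"],
        "timeout": [],            # no outgoing transition: a deadlock state
    }

    def reachable_deadlocks(graph, init):
        """Depth-first search collecting reachable states without successors."""
        seen, stack, deadlocks = set(), [init], []
        while stack:
            state = stack.pop()
            if state in seen:
                continue
            seen.add(state)
            successors = graph.get(state, [])
            if not successors:
                deadlocks.append(state)
            stack.extend(successors)
        return deadlocks

    print(reachable_deadlocks(GRAPH, "start"))   # ['timeout']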

    Uncertainties in emission inventories

    Emission inventories provide information about the amount of a pollutant that is emitted to the atmosphere as a result of a specific anthropogenic or natural process at a given time or place. Emission inventories can be used for either policy or scientific purposes. For policy purposes, emission inventories can be used to monitor the progress of environmental policy or to check compliance with conventions and protocols. For scientific purposes, emission inventories can be used as input into atmospheric dispersion models that are aimed at understanding the chemical and physical processes and the behaviour of air pollutants in the atmosphere. A strict separation between policy-oriented and science-oriented emission inventories is not always possible. The usefulness of emission inventories for policy or science depends on the accuracy and the reliability of the inventories. There is uncertainty about an emission inventory when the accuracy and reliability of the emission estimates are not known. Proper use of emission inventories requires an assessment of the uncertainties, including identification, qualification and quantification of the uncertainty. Although different methods for the assessment of uncertainty in emission inventories have been proposed, a systematic approach for identification, qualification and quantification of uncertainty does not exist. The objective of this thesis is to develop such a systematic approach for large-scale inventories. In order to meet this objective three research questions have been formulated: (i) What are the potential sources of uncertainty in emission inventories? (ii) Which methods can be followed for the assessment of uncertainty? (iii) To what extent can uncertainty in emission inventories be identified, qualified or quantified?
    The methodology of emission inventory compilation typical for large-scale emission inventories has been illustrated by two emission inventories. In chapter 2, time series of past worldwide emissions of anthropogenic trace gases for the period 1890-1990 are described. Chapter 3 presents projections for NOx emissions in Asia for the period 1990-2020. The construction of these emission inventories was hampered by the lack of experimental data on the different sources of emission. As a result, the emissions were calculated on another scale than the one on which the emission processes occur in reality. The activity data and emission factors were based on extrapolation of existing information. Due to these aggregations and extrapolations, the emission inventories are inaccurate representations of the actual emissions.
    Chapter 4 describes the theoretical basis for our definitions of uncertainties, followed by a categorisation of uncertainties in emission inventories. It is argued that two types of uncertainty in emission inventories exist. Uncertainty about accuracy is the lack of knowledge about the sources and size of the inaccuracy. Uncertainty about reliability is the lack of knowledge about the degree to which the emission inventory is meeting user-specified quality criteria. These user-specified criteria depend on the purpose of the emission inventory. For scientific purposes the reliability is defined by the accuracy of the inventory. For policy purposes, quality criteria can be related to transparency, application of agreed-upon methodologies or sometimes also to the assessment of accuracy.
    Uncertainty about reliability exists when either the accuracy of the emission inventory is not known or when the documentation of the inventory is inadequate or incomplete. Uncertainty about accuracy exists when the different sources of inaccuracy or the extent to which the inventory is inaccurate are not known. A categorisation of uncertainty about different sources of inaccuracy has been presented. Uncertainty about structural inaccuracy is the lack of knowledge about the extent to which the structure of an emission inventory allows for an accurate calculation of the 'real' emission. Three causes of structural inaccuracy have been defined. These are aggregation error, incompleteness and mathematical formulation error. Uncertainty about input value inaccuracy is the lack of knowledge about the values of activity data and emission factors. Four causes of input value inaccuracy have been identified. These are extrapolation error, measurement error, unknown developments and reporting error.
    Uncertainty about reliability can be assessed through peer review. For the assessment of inaccuracy, a distinction is made between internal and external assessment of uncertainty. In an internal assessment, the methodology and information used to construct an emission inventory form the basis for the assessment of inaccuracy. Based on a review of available methodologies, six methods for internal assessment are proposed: (i) qualitative discussion, (ii) data quality rating, (iii) calculation check and evaluation of the mathematical formulation, (iv) expert judgement, (v) error propagation and (vi) importance analysis. In an external assessment, the difference between the emission inventory and external sources of information is used to identify, qualify or quantify inaccuracy in the emission inventory. Four methods can be used: (i) comparison with other emission inventories, (ii) comparison with (in)direct measurements, (iii) forward air quality modelling and (iv) inverse air quality modelling.
    Against this background we developed a systematic approach for the assessment of uncertainty in emission inventories. This framework, FRAULEIN (FRamework for the Assessment of Uncertainty in Large-scale Emission INventories), can be used to assess uncertainty about reliability and uncertainty about accuracy. It provides guidance for the selection of the methods that can be used to identify, qualify or quantify different sources of uncertainty.
    Several methods included in the framework have been analysed in more detail to identify the advantages and disadvantages of these methods in practice. Chapter 5 presents the results of the assessment of uncertainties in estimates of 1990 N2O emissions from agriculture in The Netherlands using the methods of error propagation and importance analysis. The results indicate that only a small number (three out of 23) of uncertain inventory parameters have a large share in the inaccuracy of the emission inventory. These parameters include the emission factor for indirect N2O emissions (EF5), the fraction of N leaching from agricultural soils (Fracleach) and the emission factor for direct soil emissions (EF1). Reducing the inaccuracy in the inventory should therefore focus on improved quantification of indirect emissions (based on EF5 and Fracleach) and direct soil emissions (EF1).
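    A schematic Python sketch of error propagation and importance analysis of this kind (all activity data, emission factors and uncertainty ranges below are invented illustration values, not the thesis's IPCC-based inputs):

    import random
    import statistics

    random.seed(1)
    SOURCES = {                 # name: (activity, EF mean, EF relative std dev)
        "direct soil (EF1)": (410.0, 0.0125, 0.50),
        "indirect (EF5)":    (150.0, 0.0250, 0.80),
        "animal waste":      (220.0, 0.0200, 0.30),
    }

    def sample_total():
        """One Monte Carlo draw: emission = activity * sampled emission factor."""
        draws = {name: act * random.gauss(ef, ef * rsd)
                 for name, (act, ef, rsd) in SOURCES.items()}
        return draws, sum(draws.values())

    totals, per_source = [], {name: [] for name in SOURCES}
    for _ in range(20_000):
        draws, total = sample_total()
        totals.append(total)
        for name, value in draws.items():
            per_source[name].append(value)

    print(f"total: {statistics.mean(totals):.1f} +/- {statistics.pstdev(totals):.1f}")
    # Importance analysis: share of each source's variance in the total variance
    # (the shares sum to ~1 here because the sources are sampled independently).
    for name, values in per_source.items():
        share = statistics.pvariance(values) / statistics.pvariance(totals)
        print(f"  {name}: {share:.0%} of the variance")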
    From a methodological point of view, the results of the N2O case study show that quantification of input value inaccuracy through error propagation is influenced by the statistical interpretation of the available information in the IPCC Guidelines (default values, and uncertainty ranges of emission factors in particular). This result provides an indication that the extent to which inaccuracies can be assessed depends not only on the characteristics of the method used for the assessment but also on the available information on inventory parameters. Identification of the inventory parameters having the largest share in the inaccuracy, on the other hand, was not influenced by the statistical interpretation of the IPCC information.
    Chapter 6 describes the results of the assessment of uncertainty in a European emission inventory of SO2 in 1994 using forward air quality modelling and atmospheric measurements. The problem with this type of assessment is that it is not easy to pinpoint emission inventory inaccuracy as the single cause of the deviation between measurements and model results; inaccuracies exist in the inventory, the model and the measurements. In the case study it has been analysed whether wind-direction-dependent differences between calculated and measured concentrations can be used to assess inaccuracies in emission inventories. The results indicate that in three regions within the study domain inaccuracy in the emission inventory is the most likely cause of the discrepancy between modelled and observed SO2 concentrations. These regions are Sachsen/Brandenburg (Germany), Central England and the western part of the Russian Federation. In Sachsen/Brandenburg and Central England the spatial distribution of the emissions seems to be inaccurate, while in the western part of the Russian Federation the total emission estimate seems to be inaccurate. We developed a relatively simple method to identify inventory inaccuracies based on differences between the air quality model and atmospheric measurements. However, it was also shown that the method is primarily a tool for identifying relatively inaccurate parts of the inventory. The method cannot be used to analyse causes of the inaccuracies, such as an inaccurate structure or inaccurate input values. Furthermore, it was concluded that the method is more a qualitative than a quantitative approach.
    There are three ways to use FRAULEIN in practice. First, in situations where the method for uncertainty assessment is prescribed, FRAULEIN clarifies the sources of uncertainty that can be identified, qualified or quantified. Second, if the objective of a study is to assess a specific source of uncertainty, FRAULEIN may serve as a guide for the selection of appropriate methods. Third, if the aim is to perform a full assessment of inaccuracy, FRAULEIN forms the basis of a four-step approach: (1) identification, (2) qualification and (3) quantification of the sources of inaccuracy, followed by (4) evaluation to prioritise further research