Process algebra for performance evaluation
This paper surveys the theoretical developments in the field of stochastic process algebras, process algebras where action occurrences may be subject to a delay that is determined by a random variable. A huge class of resource-sharing systems – like large-scale computers, client–server architectures, networks – can accurately be described using such stochastic specification formalisms. The main emphasis of this paper is the treatment of operational semantics, notions of equivalence, and (sound and complete) axiomatisations of these equivalences for different types of Markovian process algebras, where delays are governed by exponential distributions. Starting from a simple actionless algebra for describing time-homogeneous continuous-time Markov chains, we consider the integration of actions and random delays both as a single entity (as in well-known Markovian process algebras such as TIPP, PEPA and EMPA) and as separate entities (as in the timed process algebras timed CSP and TCCS). In total we consider four related calculi and investigate their relationship to existing Markovian process algebras. We also briefly indicate how one can profit from the separation of time and actions when incorporating more general, non-Markovian distributions.
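As a toy illustration of the race between exponentially distributed delays that underlies such calculi, the sketch below simulates one trajectory of a small continuous-time Markov chain. The state names and rates are invented for illustration and do not come from the paper.

```python
import random

# Invented example chain: each enabled transition carries an
# exponentially distributed delay; the minimum delay wins the race,
# which is exactly the CTMC semantics used by Markovian calculi.
RATES = {
    "idle": [("busy", 2.0)],               # request arrives at rate 2
    "busy": [("idle", 3.0), ("fail", 0.1)],
    "fail": [("idle", 0.5)],               # repair at rate 0.5
}

def simulate(start, horizon, rng=random.Random(42)):
    """Return the (time, state) trajectory up to the given horizon."""
    t, state, trace = 0.0, start, [(0.0, start)]
    while True:
        # One exponential sample per enabled transition; take the minimum.
        samples = [(rng.expovariate(rate), succ)
                   for succ, rate in RATES[state]]
        delay, succ = min(samples)
        if t + delay > horizon:
            return trace
        t, state = t + delay, succ
        trace.append((t, state))
```

Sampling one delay per transition and racing them is equivalent to sampling a single delay at the total exit rate and then branching probabilistically, a standard property of competing exponentials.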
Analysis of Timed and Long-Run Objectives for Markov Automata
Markov automata (MAs) extend labelled transition systems with random delays
and probabilistic branching. Action-labelled transitions are instantaneous and
yield a distribution over states, whereas timed transitions impose a random
delay governed by an exponential distribution. MAs are thus a nondeterministic
variation of continuous-time Markov chains. MAs are compositional and are used
to provide a semantics for engineering frameworks such as (dynamic) fault
trees, (generalised) stochastic Petri nets, and the Architecture Analysis &
Design Language (AADL). This paper considers the quantitative analysis of MAs.
We consider three objectives: expected time, long-run average, and timed
(interval) reachability. Expected time objectives focus on determining the
minimal (or maximal) expected time to reach a set of states. Long-run
objectives determine the fraction of time to be in a set of states when
considering an infinite time horizon. Timed reachability objectives are about
computing the probability to reach a set of states within a given time
interval. This paper presents the foundations and details of the algorithms and
their correctness proofs. We report on several case studies conducted using a
prototypical tool implementation of the algorithms, driven by the MAPA
modelling language for efficiently generating MAs.
Comment: arXiv admin note: substantial text overlap with arXiv:1305.705
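To give a flavour of the expected-time objective, the sketch below computes expected reachability times on a small CTMC by a fixed-point iteration. It is a generic illustration on an invented model, not the paper's algorithm, which additionally handles the nondeterminism of Markov automata.

```python
# Invented CTMC: rates[s][t] is the rate from s to t. The expected time
# to reach the goal satisfies E[goal] = 0 and, for other states,
#   E[s] = 1/R(s) + sum_t (rates[s][t]/R(s)) * E[t],
# where R(s) is the total exit rate of s.
RATES = {
    "s0": {"s1": 1.0, "s2": 1.0},
    "s1": {"goal": 2.0},
    "s2": {"s1": 0.5},
    "goal": {},
}

def expected_time(goal="goal", sweeps=1000):
    """Gauss-Seidel iteration for expected reachability times."""
    e = {s: 0.0 for s in RATES}
    for _ in range(sweeps):
        for s, out in RATES.items():
            if s == goal or not out:
                continue            # goal (and deadlocked) states stay at 0
            total = sum(out.values())
            e[s] = 1.0 / total + sum(r / total * e[t]
                                     for t, r in out.items())
    return e
```

On this acyclic example the iteration converges after a couple of sweeps; in general, cyclic models need the iteration (or a linear-equation solver) to reach the fixed point.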
Benefits of spatio-temporal modelling for short term wind power forecasting at both individual and aggregated levels
The share of wind energy in total installed power capacity has grown rapidly
in recent years around the world. Producing accurate and reliable forecasts of
wind power production, together with a quantification of the uncertainty, is
essential to optimally integrate wind energy into power systems. We build
spatio-temporal models for wind power generation and obtain full probabilistic
forecasts from 15 minutes to 5 hours ahead. Detailed analysis of the forecast
performances on the individual wind farms and aggregated wind power are
provided. We show that it is possible to improve the results of forecasting
aggregated wind power by utilizing spatio-temporal correlations among
individual wind farms. Furthermore, spatio-temporal models have the advantage
of being able to produce spatially out-of-sample forecasts. We evaluate the
predictions on a data set from wind farms in western Denmark and compare the
spatio-temporal model with an autoregressive model containing a common
autoregressive parameter for all wind farms, identifying the specific cases
when it is important to have a spatio-temporal model instead of a temporal one.
This case study demonstrates that it is possible to obtain fast and accurate
forecasts of wind power generation at wind farms where data is available, but
also at a larger portfolio including wind farms at new locations. The results
and the methodologies are relevant for wind power forecasts across the globe as
well as for spatio-temporal modelling in general.
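The contrast the study draws between a common autoregressive parameter and farm-specific ones can be sketched on synthetic data. Everything below — the series, the farm names, and the coefficients — is invented for illustration and is not the paper's model.

```python
import random

def fit_ar1(series):
    """Least-squares AR(1) coefficient: x_t ≈ phi * x_{t-1}."""
    num = sum(x1 * x0 for x0, x1 in zip(series, series[1:]))
    den = sum(x0 * x0 for x0 in series[:-1])
    return num / den

rng = random.Random(0)

def make_series(phi, n=500):
    """Synthetic AR(1) series with standard normal innovations."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        out.append(x)
    return out

# Two hypothetical farms with different true persistence.
farms = {"farm_a": make_series(0.9), "farm_b": make_series(0.5)}

# Farm-specific coefficients versus one pooled coefficient obtained by
# stacking the moments, as a common-parameter model implicitly does.
per_farm = {name: fit_ar1(s) for name, s in farms.items()}
pooled = (sum(sum(x1 * x0 for x0, x1 in zip(s, s[1:])) for s in farms.values())
          / sum(sum(x0 * x0 for x0 in s[:-1]) for s in farms.values()))
```

The pooled estimate necessarily lands between the per-farm ones, which is why a single common parameter degrades forecasts precisely when farms differ in their temporal dynamics.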
Research on the reasoning, teaching and learning of probability and uncertainty
In this editorial, we set out the aims in the call to publish papers on informal statistical inference, randomness, modelling and risk. We discuss how the papers published in this issue have responded to those aims. In particular, we note how the nine papers contribute to some of the major debates in mathematics and statistics education, often taking contrasting positions. Such debates range across: (1) whether knowledge is fractured or takes the form of mental models; (2) heuristic or intuitive thinking versus operational thinking, as for example in dual process theory; (3) the role of different epistemic resources, such as perceptions, modelling and imagery, in the development of probabilistic reasoning; (4) how design and situation impact upon probabilistic learning.
Micromechanical investigation of the influence of defects in high cycle fatigue
This study aims to analyse the influence of geometrical defects (notches and holes) on the high cycle fatigue behaviour of an electrolytic copper based on finite element simulations of 2D polycrystalline aggregates. In order to investigate the role of each source of anisotropy on the mechanical response at the grain scale, three different material constitutive models are assigned successively to the grains: isotropic elasticity, cubic elasticity and crystal plasticity in addition to the cubic elasticity. The significant influence of the elastic anisotropy on the mechanical response of the grains is highlighted. When considering smooth microstructures, the crystal plasticity has a slight effect in comparison with the cubic elasticity influence. However, in the case of notched microstructures, the influence of the plasticity is no longer negligible. Finally, the predictions of three fatigue criteria are analysed. Their ability to predict the defect size effect on the fatigue strength is evaluated thanks to a comparison with experimental data from the literature.
Probabilistic model checking of complex biological pathways
Probabilistic model checking is a formal verification technique that has been successfully applied to the analysis of systems from a broad range of domains, including security and communication protocols, distributed algorithms and power management. In this paper we illustrate its applicability to a complex biological system: the FGF (Fibroblast Growth Factor) signalling pathway. We give a detailed description of how this case study can be modelled in the probabilistic model checker PRISM, discussing some of the issues that arise in doing so, and show how we can thus examine a rich selection of quantitative properties of this model. We present experimental results for the case study under several different scenarios and provide a detailed analysis, illustrating how this approach can be used to yield a better understanding of the dynamics of the pathway.
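A core quantitative query behind such analyses is the transient probability of being in each state of a CTMC at time t. The hand-rolled sketch below computes it by uniformisation, a standard technique also used inside probabilistic model checkers; the two-state chain is invented and unrelated to the FGF model.

```python
import math

def transient(rates, p0, t, terms=200):
    """Uniformisation: p(t) = sum_k e^{-qt} (qt)^k / k! * (p0 @ P^k).

    `rates[i][j]` is the CTMC rate from state i to j (zero diagonal);
    `p0` is the initial distribution.
    """
    n = len(p0)
    exit_rates = [sum(row) for row in rates]
    q = max(exit_rates) or 1.0            # uniformisation rate
    # Discrete-time kernel P = I + Q/q (Q has -exit_rate on the diagonal).
    P = [[rates[i][j] / q + ((1.0 - exit_rates[i] / q) if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    p, result = list(p0), [0.0] * n
    weight = math.exp(-q * t)             # Poisson weight for k = 0
    for k in range(terms):
        for i in range(n):
            result[i] += weight * p[i]
        weight *= q * t / (k + 1)         # next Poisson weight
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return result
```

Truncating the Poisson sum after enough terms gives any desired accuracy; production model checkers choose the truncation point from an error bound rather than a fixed count.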