Concurrency meets probability: theory and practice (abstract)
Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between probabilistic bisimulation [16] and Markov chain lumping [15] have been found. A plethora of probabilistic concurrency models has emerged [19]. Over the years, the focus shifted from covering discrete to treating continuous stochastic phenomena [12, 13]. We argue that both aspects can be elegantly combined with non-determinism, yielding the Markov automata model [8]. This model has nice theoretical characteristics. It is closed under parallel composition and hiding. Conservative extensions of (bi)simulation are congruences [8, 4]. It has a simple process algebraic counterpart [20]. On-the-fly partial-order reduction yields substantial state-space reductions [21]. Quantitative analysis of Markov automata largely depends on (efficient) linear programming and scales well [11]. More importantly though: Markov automata serve an important practical need. They are the obvious choice for providing semantics to the Architecture Analysis & Design Language (AADL [9]), an industry standard for the automotive and aerospace domain. As experienced in several ESA projects, this holds in particular for the AADL annex dealing with error models [3]. They provide a compositional semantics to dynamic fault trees [6], a key model for reliability engineering [2]. Finally, they give a natural semantics to every generalised stochastic Petri net (GSPN [17]), a prominent model in performance analysis. This conservatively extends the existing GSPN semantics that is restricted to "well-defined" nets, i.e., nets without non-determinism [7]. Powerful software tools support this and incorporate efficient analysis and minimisation algorithms [11]. This substantiates our take-home message: Markov automata bridge the gap between an elegant theory and practical engineering needs.
Code Generation = A* + BURS
BURS, a system based on term rewrite systems, is combined with the search algorithm A* to produce a code generator that generates optimal code. The theory underlying BURS is re-developed, formalised and explained in this work. The search algorithm uses a cost heuristic derived from the term rewrite system to direct the search. The advantage of using a search algorithm is that we need to compute only those costs that may be part of an optimal rewrite sequence.
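The abstract's central idea, steering A* with a cost heuristic so that only costs on potentially optimal rewrite sequences are ever computed, can be pictured with a generic sketch. The state representation, successor function and heuristic below are illustrative stand-ins and do not reproduce the BURS formalism itself:

```python
import heapq
from itertools import count

def a_star(start, is_goal, successors, heuristic):
    """Generic A*: states are hashable; successors(s) yields (next_state, rule, cost).

    Returns a cheapest rule sequence from start to a goal state. Nodes are
    expanded (and hence costed) only while f = g + h may lie on an optimal path.
    """
    tie = count()  # tiebreaker so heapq never compares states directly
    frontier = [(heuristic(start), 0, next(tie), start, [])]
    best_g = {start: 0}
    while frontier:
        f, g, _, state, rules = heapq.heappop(frontier)
        if is_goal(state):
            return rules, g
        if g > best_g.get(state, float("inf")):
            continue  # stale queue entry, a cheaper path was found meanwhile
        for nxt, rule, cost in successors(state):
            g2 = g + cost
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + heuristic(nxt), g2, next(tie), nxt, rules + [rule]))
    return None, float("inf")
```

With an admissible heuristic (one that never overestimates the remaining rewrite cost), the first goal state popped carries an optimal rewrite sequence, and nodes whose f-value exceeds the optimum are never expanded, mirroring the claim that only potentially optimal costs need to be computed.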
Analysis of Internal Boundaries and Transition Regions in Geophysical Systems with Advanced Processing Techniques
This thesis examines the utility of the Rényi entropy (RE), a measure of the complexity of probability density functions, as a tool for finding physically meaningful patterns in geophysical data. Initially, the RE is applied to observational data of long-lived atmospheric tracers in order to analyse the dynamics of stratospheric transition regions associated with barriers to horizontal mixing. Its wider applicability is investigated by testing the RE as a method for highlighting internal boundaries in snow and ice from ground penetrating radar (GPR) recordings. High-resolution 500 MHz GPR soundings of dry snow were acquired at several sites near Scott Base, Antarctica, in 2008 and 2009, with the aim of using the RE to facilitate the identification and tracking of subsurface layers to extrapolate point measurements of accumulation from snow pits and firn cores to larger areas.
The atmospheric analysis focuses on applying the RE to observational tracer data from the EOS-MLS satellite instrument. Nitrous oxide (N2O) is shown to exhibit subtropical RE maxima in both hemispheres. These peaks are a measure of the tracer gradients that mark the transition between the tropics and the mid-latitudes in the stratosphere, also referred to as the edges of the tropical pipe. The RE maxima are shown to be located closer to the equator in winter than in summer. This agrees well with the expected behaviour of the tropical pipe edges and is similar to results reported by other studies. Compared to other stratospheric mixing metrics, the RE has the advantage that it is easy to calculate as it does not, for example, require conversion to equivalent latitude and does not rely on dynamical information such as wind fields.
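The claim that the RE is easy to calculate can be made concrete with a minimal sketch: the Rényi entropy of order α of a discrete distribution is H_α = log(Σᵢ pᵢ^α)/(1 − α). The choice α = 2 and the histogram binning below are illustrative assumptions, not necessarily the settings used in the thesis:

```python
import numpy as np

def renyi_entropy(samples, alpha=2.0, bins=64):
    """Rényi entropy H_a = log(sum_i p_i**a) / (1 - a) of a sample histogram.

    alpha -> 1 recovers the Shannon entropy; alpha = 2 weights dominant bins,
    so strong tracer gradients within a region show up as entropy extrema.
    """
    hist, _ = np.histogram(samples, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))
```

Note that, as the abstract says, nothing here requires equivalent-latitude conversion or wind fields; the input is just the tracer samples themselves.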
The RE analysis also reveals occasional sudden poleward shifts of the southern hemisphere tropical pipe edge during austral winter which are accompanied by increased mid-latitude N2O levels. These events are investigated in more detail by creating daily high-resolution N2O maps using a two-dimensional trajectory model and MERRA reanalysis winds to advect N2O observations forwards and backwards in time on isentropic surfaces. With the aid of this ‘domain filling’ technique it is illustrated that the increase in southern hemisphere mid-latitude N2O during austral winter is probably the result of the cumulative effect of several large-scale, episodic leaks of N2O-rich air from the tropical pipe. A comparison with the global distribution of potential vorticity strongly suggests that irreversible mixing related to planetary wave breaking is the cause of the leak events. Between 2004 and 2011 the large-scale leaks are shown to occur approximately every second year and a connection to the equatorial quasi-biennial oscillation is found to be likely, though this cannot be established conclusively due to the relatively short data set.
Identification and tracking of subsurface boundaries, such as ice layers in snow or the bedrock of a glacier, is the focus of the cryospheric part of this project. The utility of the RE for detecting amplitude gradients associated with reflections in GPR recordings is initially tested on a 25 MHz sounding of an Antarctic glacier. The results show distinct regions of increased RE values that allow identification of the glacial bedrock along large parts of the profile. Due to its low computational requirements, the RE is found to be an effective pseudo gain function for initial analysis of GPR data in the field. While other gain functions often have to be tuned to give a good contrast between reflections and background noise over the whole vertical range of a profile, the RE tends to assign all detectable amplitude gradients a similar (high) value, resulting in a clear contrast between reflections and background scattering. Additionally, theoretical considerations allow the definition of a ‘standard’ data window size with which the RE can be applied to recordings from most pulsed GPR systems, largely independent of centre frequency. This is confirmed by tests with higher frequency recordings (50 and 500 MHz) acquired on the McMurdo Ice Shelf. However, these also reveal that the RE processing is less reliable for identifying more closely spaced reflections from internal layers in dry snow.
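The pseudo gain use can be pictured as a window slid down each trace, with the RE of the samples in each window plotted against depth. The sketch below reuses renyi_entropy from the earlier sketch; the window length and hop are illustrative placeholders for the ‘standard’ window size discussed in the thesis:

```python
import numpy as np

def re_profile(trace, window=64, hop=8, alpha=2.0, bins=32):
    """Slide a window down one GPR trace and return the RE of each window.

    Windows containing a reflection (a strong amplitude gradient) receive a
    similarly high RE value, so reflections contrast with background scatter
    without per-depth gain tuning. Assumes renyi_entropy() defined above.
    """
    starts = range(0, max(1, len(trace) - window + 1), hop)
    return np.array([renyi_entropy(trace[s:s + window], alpha, bins) for s in starts])
```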
In order to complete the intended high-resolution analysis of accumulation patterns by tracking internal snow layers in the 500 MHz data from two test sites, a different processing approach is developed. Using an estimate of the emitted waveform obtained from direct measurement, deterministic deconvolution via the Fourier domain is applied to the high-resolution GPR data. This reveals unambiguous reflection horizons which can be observed in repeat measurements made one year apart. Point measurements of average accumulation from snow pits and firn cores are extrapolated to larger areas by identifying and tracking a dateable dust layer horizon in the radargrams. Furthermore, it is shown that annual compaction rates of snow can be estimated by tracking several internal reflection horizons along the deconvolved radar profiles and calculating the average change in separation of horizon pairs from one year to the next. The technique is complementary to point measurements from other studies, and the derived compaction rates agree well with published values and theoretical estimates.
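Deterministic deconvolution with a directly measured source wavelet amounts to a stabilised spectral division. The water-level regularisation below is one common stabilisation choice and is an assumption here; the thesis may use a different scheme:

```python
import numpy as np

def deconvolve(trace, wavelet, water_level=1e-2):
    """Deterministic deconvolution: divide the trace spectrum by the wavelet spectrum.

    A water level clamps near-zero wavelet amplitudes (keeping their phase) so
    the division does not amplify noise; the output approximates the
    subsurface reflectivity series, i.e. sharp reflection horizons.
    """
    n = len(trace)
    T = np.fft.rfft(trace, n)
    W = np.fft.rfft(wavelet, n)
    mag = np.abs(W)
    floor = water_level * mag.max()
    W_stab = np.where(mag < floor, W * (floor / np.maximum(mag, 1e-30)), W)
    return np.fft.irfft(T / W_stab, n)
```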
Causal ambiguity and partial orders in event structures
Event structure models often have some constraint which ensures that for each system run it is clear what the causal predecessors of an event are (i.e. there is no causal ambiguity). In this contribution we study what happens if we remove such constraints. We define five different partial order semantics that are intentional in the sense that they refer to syntactic aspects of the model. We also define an observational partial order semantics, which derives a partial order from just the event traces. This turns out to correspond to the so-called early intentional semantics; the other intentional semantics cannot be observationally characterised. We study the equivalences induced by the different partial order definitions, and their interrelations.
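As a generic illustration of deriving order from traces alone (a standard construction, not necessarily any of the semantics defined in the paper): order e below f when every observed run containing both events sees e first.

```python
from itertools import combinations

def observed_order(traces):
    """Derive a binary order from event traces: e < f iff in every trace
    containing both, e occurs before f (and they co-occur at least once).

    Assumes each event occurs at most once per trace. Under causal
    ambiguity, different runs may witness conflicting orders, so pairs
    ordered in some individual run can drop out of the global relation.
    """
    events = {e for t in traces for e in t}
    order = set()
    for e, f in combinations(sorted(events), 2):
        for a, b in ((e, f), (f, e)):
            runs = [t for t in traces if a in t and b in t]
            if runs and all(t.index(a) < t.index(b) for t in runs):
                order.add((a, b))
    return order
```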
Evaluation and Improvement of Procurement Process with Data Analytics
This paper presents a compositional framework for the modeling of interactive continuous-time Markov chains with time-dependent rates, a subclass of communicating piecewise deterministic Markov processes. A polynomial-time algorithm is presented for computing the coarsest quotient under strong bisimulation for rate functions that are either piecewise uniform or (piecewise) polynomial. Strong as well as weak bisimulation are shown to be congruence relations for the compositional framework, thus allowing component-wise minimization. In addition, a new characterization of transient probabilities in time-inhomogeneous Markov chains with piecewise uniform rates is provided.
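Coarsest-quotient computation of this kind is typically done by partition refinement: split blocks until every state in a block has the same cumulative rate into each block. The sketch below treats the simpler case of constant rates (an ordinary CTMC); handling piecewise-uniform or polynomial rate functions, as the paper does, would compare the rate functions themselves, and the names here are illustrative:

```python
def coarsest_lumping(states, rate):
    """Naive partition refinement for strong bisimulation on a CTMC.

    rate(s, t) is the constant transition rate from s to t. Blocks are
    split until every state in a block has the same total rate into each
    block; the fixed point is the coarsest ordinary lumping, i.e. the
    strong bisimulation quotient.
    """
    partition = [frozenset(states)]
    changed = True
    while changed:
        changed = False
        refined = []
        for block in partition:
            # Signature of s: total rate from s into each current block.
            groups = {}
            for s in block:
                key = tuple(sum(rate(s, t) for t in b) for b in partition)
                groups.setdefault(key, set()).add(s)
            if len(groups) > 1:
                changed = True
            refined.extend(frozenset(g) for g in groups.values())
        partition = refined
    return partition
```

Each pass either splits some block or terminates, so at most |states| passes of quadratic work are needed, consistent with the polynomial-time claim.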
Towards a logic for performance and mobility
Klaim is an experimental language designed for modeling and programming distributed systems composed of mobile components, where distribution awareness and dynamic system architecture configuration are key issues. StocKlaim [R. De Nicola, D. Latella, and M. Massink. Formal modeling and quantitative analysis of KLAIM-based mobile systems. In ACM Symposium on Applied Computing (SAC), ACM Press, 2005; also available as Technical Report 2004-TR-25, CNR/ISTI, 2004] is a Markovian extension of the core subset of Klaim which includes process distribution, process mobility, asynchronous communication, and site creation. In this paper, MoSL, a temporal logic for StocKlaim, is proposed which addresses and integrates the issues of distribution awareness and mobility and those concerning the stochastic behaviour of systems. The satisfaction relation is formally defined over labelled Markov chains. A large fragment of the proposed logic can be translated to action-based CSL, for which efficient model checkers exist. In this way, such model checkers can be used for the verification of StocKlaim models against MoSL properties. An example application is provided.