Probabilistic Model-Based Safety Analysis
Model-based safety analysis approaches aim at finding critical failure
combinations by analysis of models of the whole system (i.e. software,
hardware, failure modes and environment). The advantage of these methods
compared to traditional approaches is that the analysis of the whole system
gives more precise results. Only a few model-based approaches have been applied
to answer quantitative questions in safety analysis, often limited to the analysis
of specific failure propagation models, limited types of failure modes, or models
without system dynamics and behavior, as direct quantitative analysis uses
large amounts of computing resources. New achievements in the domain of
(probabilistic) model-checking now allow for overcoming this problem.
This paper shows how functional models based on synchronous parallel
semantics, which can be used for system design, implementation and qualitative
safety analysis, can be directly re-used for (model-based) quantitative safety
analysis. Accurate modeling of different types of probabilistic failure
occurrence is shown as well as accurate interpretation of the results of the
analysis. This allows for reliable and expressive assessment of the safety of a
system in early design stages.
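As a loose illustration of the kind of quantitative query such an analysis answers (this is not the paper's synchronous modeling framework or tool chain; the states, probabilities and step count below are invented), a probabilistic model checker essentially computes reachability probabilities over a stochastic transition system:

# Minimal sketch, not the paper's toolchain: probability of reaching a
# hazardous state within n steps of a small discrete-time Markov chain,
# the kind of quantitative safety query a probabilistic model checker answers.
# States, transition probabilities and step count are hypothetical.
import numpy as np

states = ["ok", "degraded", "hazard"]   # hypothetical system modes
P = np.array([
    [0.990, 0.009, 0.001],              # ok       -> ok / degraded / hazard
    [0.000, 0.950, 0.050],              # degraded -> ...
    [0.000, 0.000, 1.000],              # hazard is absorbing
])

def p_reach_hazard(steps: int) -> float:
    """P(hazard reached within `steps` steps), starting in 'ok'."""
    dist = np.array([1.0, 0.0, 0.0])
    for _ in range(steps):
        dist = dist @ P
    return dist[states.index("hazard")]

print(f"P(hazard within 1000 steps) = {p_reach_hazard(1000):.4f}")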
Model exploration and analysis for quantitative safety refinement in probabilistic B
The role played by counterexamples in standard system analysis is well known;
but less common is a notion of counterexample in probabilistic systems
refinement. In this paper we extend previous work using counterexamples to
inductive invariant properties of probabilistic systems, demonstrating how they
can be used to extend the technique of bounded model checking-style analysis
for the refinement of quantitative safety specifications in the probabilistic B
language. In particular, we show how the method can be adapted to cope with
refinements incorporating probabilistic loops. Finally, we demonstrate the
technique on pB models summarising a one-step refinement of a randomised
algorithm for finding the minimum cut of undirected graphs, and that for the
dependability analysis of a controller design.
Comment: In Proceedings Refine 2011, arXiv:1106.348
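For readers unfamiliar with the algorithm being refined, the following plain-Python sketch of the randomised contraction (Karger) minimum-cut procedure shows the behaviour the pB model captures; it is an illustration only, not the pB formalisation from the paper, and the example graph is made up.

# Illustration only: a plain-Python version of the randomised contraction
# algorithm for the minimum cut of an undirected multigraph. Repeating the
# run many times drives the failure probability down, which is exactly the
# kind of quantitative claim a probabilistic refinement has to track.
import random

def karger_min_cut(edges):
    """One contraction run: returns the size of the cut it finds."""
    edges = list(edges)
    parent = {v: v for e in edges for v in e}

    def find(v):                      # union-find representative
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    vertices = set(parent)
    while len(vertices) > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru == rv:
            continue
        parent[rv] = ru               # contract edge (u, v)
        vertices.discard(rv)
        edges = [e for e in edges if find(e[0]) != find(e[1])]
    return len(edges)                 # remaining edges cross the found cut

g = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)]
print(min(karger_min_cut(g) for _ in range(200)))   # min cut of g is 2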
Time-of-flight photoelectron momentum microscopy with 80–500 MHz photon sources: electron-optical pulse picker or bandpass pre-filter
The small time gaps of synchrotron radiation in conventional multi-bunch mode (100–500 MHz) or of laser-based sources with high pulse rate (80 MHz) are prohibitive for time-of-flight (ToF) based photoelectron spectroscopy. Detectors with time resolution in the 100 ps range yield only 20–100 resolved time slices within the small time gap. Here we present two techniques for implementing efficient ToF recording at sources with high repetition rate. A fast electron-optical beam-blanking unit with GHz bandwidth, integrated in a photoelectron momentum microscope, allows electron-optical pulse picking with any desired repetition period. Aberration-free momentum distributions have been recorded at reduced pulse periods of 5 MHz at MAX II and 1.25 MHz at BESSY II. The approach is compared with two alternative solutions: a bandpass pre-filter (here a hemispherical analyzer) or a parasitic four-bunch island-orbit pulse train coexisting with the multi-bunch pattern on the main orbit. Chopping in the time domain or bandpass pre-selection in the energy domain can both enable efficient ToF spectroscopy and photoelectron momentum microscopy at 100–500 MHz synchrotrons, highly repetitive lasers or cavity-enhanced high-harmonic sources. The high photon flux of a UV laser (80 MHz, <1 meV bandwidth) facilitates momentum microscopy with an energy resolution of 4.2 meV and an analyzed region of interest (ROI) down to <800 nm. In this novel approach to sub-µm ARPES the ROI is defined by a small field aperture in an intermediate Gaussian image, regardless of the size of the photon spot.
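The quoted slice counts follow from simple arithmetic; the short sketch below (assuming the ~100 ps detector time resolution stated above) reproduces them:

# Back-of-the-envelope check of the numbers quoted above: with ~100 ps
# detector time resolution, the number of resolvable ToF slices per photon
# pulse is simply (pulse period) / (time resolution).
detector_resolution_s = 100e-12                 # ~100 ps timing resolution

for rep_rate_hz in (100e6, 500e6):              # conventional multi-bunch rates
    period_s = 1.0 / rep_rate_hz
    slices = period_s / detector_resolution_s
    print(f"{rep_rate_hz/1e6:.0f} MHz -> period {period_s*1e9:.0f} ns, ~{slices:.0f} slices")
# 500 MHz gives a 2 ns period and ~20 slices, 100 MHz gives 10 ns and ~100 slices,
# which is why pulse picking down to 5 MHz or 1.25 MHz restores a usable ToF window.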
OCL and graph transformations – a symbiotic alliance to alleviate the frame problem
Many popular methodologies are influenced by Design by Contract. They recommend specifying the intended behavior of operations in an early phase of the software development life cycle. In practice, software developers most often use natural language to describe how the state of the system is supposed to change when the operation is executed. Formal contract specification languages are still rarely used because their semantics often mismatch the needs of software developers. Restrictive specification languages usually suffer from the "frame problem": it is hard to express which parts of the system state should remain unaffected when the specified operation is executed. Constructive specification languages, instead, tend to make specifications overly deterministic. This paper investigates how a combination of OCL and graph transformations can overcome the frame problem and make constructive specifications less deterministic. Our new contract specification language is considerably more expressive than both pure OCL and pure graph transformations.
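To make the frame problem concrete, here is a small hypothetical Python example (an invented Account class, not taken from the paper): the intended-effect part of a postcondition is short, while the frame conditions pinning down everything that must not change have to be enumerated field by field:

# Toy illustration of the frame problem in Design-by-Contract style
# postconditions; the Account class and withdraw operation are hypothetical.
from dataclasses import dataclass, replace

@dataclass
class Account:
    balance: int
    owner: str
    frozen: bool = False

def withdraw(acc: Account, amount: int) -> Account:
    new = replace(acc, balance=acc.balance - amount)
    # The postcondition as usually written: only the intended effect.
    assert new.balance == acc.balance - amount
    # The frame condition, which restrictive specification languages force
    # you to spell out explicitly (and which is easy to forget):
    assert new.owner == acc.owner and new.frozen == acc.frozen
    return new

print(withdraw(Account(balance=100, owner="alice"), 30))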
Electron-phonon coupling at the Te hole pocket in TiTe2
We have determined the quasiparticle dispersion for semimetallic 1T-TiTe2 at 20 K using time-of-flight momentum microscopy with tunable soft-x-ray excitation and high-resolution momentum microscopy with 6.4 eV ultraviolet excitation. In particular, we have studied the quasiparticle interactions of the electronic states of the Te 5p hole pockets with high Fermi velocity near the Γ point. Kinks in the otherwise parabolic dispersions suggest the onset of many-body interactions at a binding energy of about 30 meV. We attribute these kinks to electron-phonon coupling. Our study complements previously published results on the Ti 3d electron pockets near the M and L points. The electron-phonon coupling parameters (λ = 0.3–0.8) are in agreement with previously reported values. An apparently nonzero real part of the self-energy at the Fermi level for one of the two bands may be caused by residual charge-density-wave fluctuations.
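For context, the coupling strength quoted above is conventionally related to the measured dispersion through the slope of the real part of the self-energy at the Fermi level; this is the standard textbook relation, stated here for orientation only, and not necessarily the exact extraction procedure used in this work:

% Kinks in the dispersion reflect Re Sigma; its slope at E_F gives the
% mass-enhancement (electron-phonon coupling) parameter lambda, which also
% renormalizes the Fermi velocity.
\lambda = -\left.\frac{\partial\, \mathrm{Re}\,\Sigma(\omega)}{\partial \omega}\right|_{\omega = E_F},
\qquad
v_F^{\mathrm{ren}} = \frac{v_F^{\mathrm{bare}}}{1 + \lambda}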
Transformation of Type Graphs with Inheritance for Ensuring Security in E-Government Networks
E-government services usually process large amounts of confidential data, but at the same time they shall provide simple and user-friendly graphical interfaces. Therefore, security requirements for the communication between components have to be adhered to very strictly. Hence it is of central interest that developers can analyze their modularized models of actual systems and detect critical patterns. For this purpose, we present a general and formal framework for critical pattern detection and user-driven correction, as well as possibilities for automatic analysis and verification of security requirements at the meta-model level. The technique is based on the formal theory of graph transformation, which we extend to transformations of type graphs with inheritance within a type graph hierarchy in order to enable the specification of relevant security requirements in this scenario. The extended theory is shown to fulfil the conditions of a weak adhesive HLR category, allowing us to transfer analysis techniques and results shown for this abstract framework of graph transformation. In particular, we discuss how confluence analysis and parallelization can be used to enable distributed critical pattern detection.
A graphical specification of model transformations with triple graph grammars
Models and model transformations are the core concepts of OMG's MDA (TM) approach. Within this approach, most models are derived from the MOF and have a graph-based nature. In contrast, most of the current model transformations are specified textually. To enable a graphical specification of model transformation rules, this paper proposes to use triple graph grammars as a declarative specification formalism. These triple graph grammars can be specified within the FUJABA tool, and we argue that such rules are easier to specify and become more understandable and maintainable. To show the practicability of our approach, we present how to generate Tefkat rules from triple graph grammar rules, which helps to integrate triple graph grammars with a state-of-the-art model transformation tool and shows the expressiveness of the concept.
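As a rough illustration of what a triple graph grammar rule contains (plain Python, not FUJABA or Tefkat syntax; the class-to-table rule is a generic textbook example rather than one from the paper), a rule pairs a source pattern and a target pattern via explicit correspondence nodes:

# Hypothetical data structure sketching the shape of a triple graph grammar
# rule: a source pattern, a target pattern, and correspondence links that
# keep the two models consistent. Not tied to any particular TGG tool.
from dataclasses import dataclass, field

@dataclass
class TGGRule:
    name: str
    source_pattern: list            # graph pattern over the source model
    target_pattern: list            # graph pattern over the target model
    correspondences: list           # (source element, target element) links
    created: set = field(default_factory=set)  # elements this rule adds

class_to_table = TGGRule(
    name="ClassToTable",
    source_pattern=["Class c"],
    target_pattern=["Table t"],
    correspondences=[("Class c", "Table t")],
    created={"Table t"},            # forward-transformation reading of the rule
)
print(class_to_table.name, class_to_table.correspondences)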