Rare event simulation for dynamic fault trees
Fault trees (FT) are a popular industrial method for reliability engineering, for which Monte Carlo simulation is an important technique to estimate common dependability metrics, such as system reliability and availability. A severe drawback of Monte Carlo simulation is that the number of simulations required to obtain accurate estimates grows extremely large in the presence of rare events, i.e., events whose probability of occurrence is very low, which typically holds for failures in highly reliable systems. This paper presents a novel method for rare event simulation of dynamic fault trees with complex repairs that requires only a modest number of simulations, while retaining statistically justified confidence intervals. Our method exploits the importance sampling technique for rare event simulation, together with a compositional state space generation method for dynamic fault trees. We demonstrate our approach using two parameterized sets of case studies, showing that our method can handle fault trees that could be evaluated neither with existing analytical techniques nor with standard simulation techniques.
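The paper's own importance-sampling construction for dynamic fault trees is not reproduced here. As a minimal sketch of the general technique it builds on, the toy example below estimates a rare tail probability P(X > 5) for a standard normal X, where naive Monte Carlo with 100,000 samples almost never observes the event; the shifted proposal distribution and the threshold are illustrative choices, not the paper's model:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 100_000
c = 5.0  # rare-event threshold: estimate p = P(X > c) for X ~ N(0, 1)

# Naive Monte Carlo: virtually no samples land in the rare region,
# so the estimate is almost always exactly zero.
naive = (rng.standard_normal(n) > c).mean()

# Importance sampling: draw from a proposal g shifted into the rare
# region, then reweight each sample by the likelihood ratio f(x)/g(x)
# so the estimator remains unbiased.
y = rng.normal(loc=c, scale=1.0, size=n)          # proposal g = N(c, 1)
w = norm.pdf(y) / norm.pdf(y, loc=c, scale=1.0)   # likelihood ratio f/g
is_est = np.mean((y > c) * w)

true_p = norm.sf(c)  # exact tail probability, ~2.87e-7
```

With the same sample budget, the importance-sampling estimate lands within a few percent of the true probability, which is the effect the abstract describes: accurate estimates from a modest number of simulations.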
Massively Parallel QCD
The theory of the strong nuclear force, Quantum Chromodynamics (QCD), can be numerically simulated from first principles on massively-parallel supercomputers using the method of Lattice Gauge Theory. We describe the special programming requirements of lattice QCD (LQCD) as well as the optimal supercomputer hardware architectures that it suggests. We demonstrate these methods on the BlueGene massively-parallel supercomputer and argue that LQCD and the BlueGene architecture are a natural match. This can be traced to the simple fact that LQCD is a regular lattice discretization of space into lattice sites while the BlueGene supercomputer is a discretization of space into compute nodes, and that both are constrained by requirements of locality. This simple relation is both technologically important and theoretically intriguing. The main result of this paper is the speedup of LQCD using up to 131,072 CPUs on the largest BlueGene/L supercomputer. The speedup is perfect with sustained performance of about 20% of peak. This corresponds to a maximum of 70.5 sustained TFlop/s. At these speeds LQCD and BlueGene are poised to produce the next generation of strong interaction physics theoretical results.
Virtual Reality Based Simulation of Hysteroscopic Interventions
Virtual reality based simulation is an appealing option to supplement traditional clinical education. However, the formal integration of training simulators into the medical curriculum is still lacking. In particular, the lack of a reasonable level of realism supposedly hinders the widespread use of this technology. Therefore, we try to tackle this situation with a reference surgical simulator of the highest possible fidelity for procedural training. This overview describes all elements that have been combined into our training system as well as first results of simulator validation. Our framework allows the rehearsal of several aspects of hysteroscopy—for instance, correct fluid management, handling of excessive bleeding, appropriate removal of intrauterine tumors, or the use of the surgical instrument.
Spatial extremes of wildfire sizes: Bayesian hierarchical models for extremes
In Portugal, due to the combination of climatological and ecological factors, large wildfires are a constant threat and, owing to their economic impact, a major policy issue. In order to organize efficient fire-fighting capacity and resource management, correct quantification of the risk of large wildfires is needed. In this paper, we quantify the regional risk of large wildfire sizes by fitting a Generalized Pareto distribution to excesses over a suitably chosen high threshold. Spatio-temporal variations are introduced into the model through model parameters with suitably chosen link functions. Inference for these models is carried out using Bayesian hierarchical models and Markov chain Monte Carlo methods.
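The abstract's peaks-over-threshold idea can be sketched without the paper's Bayesian hierarchical layer: below, a plain maximum-likelihood Generalized Pareto fit to excesses over a high threshold, applied to synthetic heavy-tailed "fire size" data standing in for the Portuguese records (the data, the 95% threshold choice, and the use of SciPy are all assumptions of this sketch, not the paper's method):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Hypothetical stand-in for burned-area data: synthetic heavy-tailed sizes.
sizes = genpareto.rvs(c=0.3, scale=50.0, size=5000, random_state=rng)

u = np.quantile(sizes, 0.95)      # suitably chosen high threshold
excesses = sizes[sizes > u] - u   # peaks-over-threshold excesses

# Fit a Generalized Pareto distribution to the excesses by maximum
# likelihood, with the location parameter fixed at zero.
shape, loc, scale = genpareto.fit(excesses, floc=0)

# Tail risk for a size x above the threshold, via the POT decomposition
# P(X > x) = P(X > u) * P(X - u > x - u | X > u).
p_exceed_u = (sizes > u).mean()
x = u + 200.0
p_tail = p_exceed_u * genpareto.sf(x - u, shape, loc=0, scale=scale)
```

A positive fitted shape parameter indicates a heavy upper tail, i.e., non-negligible probability of very large fires; the paper's contribution is to let the shape and scale vary spatio-temporally through link functions rather than keeping them fixed as here.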
A fresh look at instrumentation - an introduction
The theme of "instrumentation between science, state and industry" does not square well with the venerable discourse which opposes "science" and "technology" in social studies of science. In this discourse, "technology" stands for the contrary of "science"; it represents the practical uses of science in society at large and is understood as separate from the somehow autonomous sphere of "science" (Layton 1971a). This vocabulary, widespread as it may be, is not very useful for our purposes, and, for that matter, for any inquiry into the role of instruments. Technology, in the sense of technical instruments and the knowledge systems that go with them, pervades all societal systems. There are technologies of science, of industry, of state, and so forth, and it would be ill-advised to assume that, in the end, they all flow out of "science." But even if the crude opposition of science and technology has little analytic value, the dual problem remains: how to effectively conceive the dynamic relationship between scientific spheres and other societal spheres, and how to conceive the role that technological matters play in this relationship.