
    A Power Cap Oriented Time Warp Architecture

    Controlling power usage has become a core objective in modern computing platforms. In this article we present an innovative Time Warp architecture designed to run parallel simulations efficiently under a power cap. Our architectural organization treats power usage as a foundational design principle, as opposed to the classical power-unaware Time Warp design. We provide early experimental results showing the potential of our proposal.
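    A minimal sketch of the idea, not the paper's architecture: a feedback controller that periodically compares measured power draw against the cap and grows or shrinks the pool of Time Warp worker threads. All names (read_power_watts, the pool interface) are hypothetical stand-ins.

```python
# Hypothetical sketch of a power-cap feedback loop for a Time Warp engine.
# read_power_watts() and the pool interface are illustrative stand-ins,
# not the paper's actual implementation.
import time

def power_cap_loop(pool, cap_watts, read_power_watts, period_s=0.5):
    """Resize the worker pool so measured power stays under cap_watts."""
    while pool.running():
        watts = read_power_watts()        # e.g. sampled from RAPL counters
        if watts > cap_watts and pool.active() > 1:
            pool.deactivate_one()         # shed parallelism to cut power
        elif watts < 0.9 * cap_watts:
            pool.activate_one()           # headroom left: restore workers
        time.sleep(period_s)              # control period
```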

    A fine-grain time-sharing Time Warp system

    Although Parallel Discrete Event Simulation (PDES) platforms relying on the Time Warp (optimistic) synchronization protocol already allow for exploiting parallelism, several techniques have been proposed to further improve performance. Among them are optimized approaches for state restore, as well as techniques for load balancing or (dynamically) controlling the degree of speculation, the latter specifically targeted at reducing the incidence of causality errors that lead to wasted computation. However, in state-of-the-art Time Warp systems, event processing is not preemptable, which can prevent prompt reaction to the injection of higher-priority (i.e., lower-timestamp) events. Delaying the processing of these events may, in turn, increase the incidence of incorrect speculation. In this article we present the design and realization of a fine-grain time-sharing Time Warp system, to be run on multi-core Linux machines, which makes systematic use of event preemption in order to dynamically reassign the CPU to higher-priority events/tasks. Our proposal is based on a truly dual-mode execution, application vs. platform, which includes timer-interrupt based support for bringing control back to platform mode for possible CPU reassignment at very fine-grain periods. The latter facility is offered by an ad-hoc timer-interrupt management module for Linux, which we release, together with the overall time-sharing support, within the open source ROOT-Sim platform. An experimental assessment based on the classical PHOLD benchmark and two real-world models is presented, showing how our proposal effectively reduces the incidence of causality errors, as compared to traditional Time Warp, especially when running with higher degrees of parallelism.
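    The paper's preemption support lives in kernel mode; the following is a simplified user-level analogue of the same control flow, assuming event handlers are written as generators so they can be suspended and resumed. The 1 ms tick and all names are illustrative.

```python
# User-level sketch of timer-driven event preemption (Unix-only signals).
# A fine-grain interval timer raises a flag; the dispatcher checks it at
# each step and re-queues the running event when a lower-timestamp event
# is pending, mimicking the paper's return to "platform mode".
import heapq
import itertools
import signal

preempt = False

def _on_tick(signum, frame):
    global preempt
    preempt = True

signal.signal(signal.SIGALRM, _on_tick)
signal.setitimer(signal.ITIMER_REAL, 0.001, 0.001)   # ~1 ms control period

_seq = itertools.count()
event_queue = []                  # min-heap of (timestamp, seq, generator)

def schedule(ts, ev):
    heapq.heappush(event_queue, (ts, next(_seq), ev))

def dispatch():
    global preempt
    while event_queue:
        ts, _, ev = heapq.heappop(event_queue)
        for _ in ev:              # each yield is a preemption point
            if preempt:
                preempt = False
                if event_queue and event_queue[0][0] < ts:
                    schedule(ts, ev)  # generator keeps its state; resumes later
                    break             # CPU goes to the lower-timestamp event
```

    In the real system no cooperative yield points are needed, since the timer interrupt itself brings control back to platform mode.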

    Scheduling of a Cyber-Physical System Simulation

    The work carried out in this Ph.D. thesis is part of a broader effort to automate industrial simulation systems. In the aeronautics industry, and more especially within Airbus, the historical application of simulation is pilot training. There are also more recent uses in the design of systems, as well as in the integration of these systems. These latter applications require a very high degree of representativeness, where historically the most important factor has been the pilot’s feeling. Systems are now divided into several subsystems that are designed, implemented and validated independently, in order to keep them under control despite their increasing complexity and the reduction in time-to-market. Airbus already has expertise in the simulation of these subsystems, as well as in their integration into a simulation. This expertise is empirical: simulation specialists take the schedules of previous integrations and adapt them to a new integration, a process that can be time-consuming and can introduce errors. Current trends in the industry are towards flexible production methods, integration of logistics tools for tracking, use of simulation tools in production, and resource optimization. Products are increasingly iterations of older, improved products, and tests and simulations are increasingly integrated into their life cycles. Working empirically in an industry that requires flexibility is a constraint, and nowadays it is essential to facilitate the modification of simulations. The problem, therefore, is to establish methods and tools for generating representative simulation schedules a priori. To solve this problem, we developed a method to describe the elements of a simulation and how it can be executed, together with functions to generate schedules. We then implemented a tool, based on heuristics, to automate the search for schedules. Finally, we tested and verified our method and tools in academic and industrial case studies.
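    As an illustration of heuristic schedule generation (not the thesis's actual tool), the sketch below list-schedules subsystem tasks with durations and precedence constraints onto a fixed set of resources, dispatching the longest ready task to the resource that frees up earliest. All task names and numbers are hypothetical.

```python
# Illustrative greedy list scheduler for simulation subsystems.
def schedule(tasks, deps, n_resources):
    """tasks: {name: duration}; deps: {name: set of prerequisites}."""
    finish, done, plan = {}, set(), []
    free_at = [0.0] * n_resources              # next free time per resource
    while len(done) < len(tasks):
        ready = [t for t in tasks
                 if t not in done and deps.get(t, set()) <= done]
        t = max(ready, key=tasks.__getitem__)  # longest ready task first
        earliest = max((finish[d] for d in deps.get(t, set())), default=0.0)
        r = min(range(n_resources), key=free_at.__getitem__)
        start = max(free_at[r], earliest)      # wait for prerequisites
        finish[t] = start + tasks[t]
        free_at[r] = finish[t]
        done.add(t)
        plan.append((t, r, start))
    return plan

# Hypothetical example: four subsystem models on two compute resources.
tasks = {"engines": 4, "avionics": 3, "hydraulics": 2, "cockpit": 5}
deps = {"cockpit": {"avionics"}, "hydraulics": {"engines"}}
print(schedule(tasks, deps, n_resources=2))
```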

    A Hybrid Modelling Framework for Real-time Decision-support for Urgent and Emergency Healthcare

    In healthcare, opportunities to use real-time data to support quick and effective decision-making are expanding rapidly, as data increases in volume, velocity and variety. In parallel, the need for short-term decision support to improve system resilience is increasingly relevant, with the recent COVID-19 crisis underlining the pressure that our healthcare services are under to deliver safe, effective, quality care in the face of rapidly shifting parameters. A real-time hybrid model (HM), which combines real-time data, predictions, and simulation, has the potential to support short-term decision-making in healthcare. Considering decision-making as a consequence of situation awareness focuses the HM on what information is needed where, when, how, and by whom, with a view toward sustained implementation. However, the articulation between real-time decision-support tools and a sociotechnical approach to their development and implementation is currently lacking in the literature. Having identified the need for a conceptual framework to support the development of real-time HMs for short-term decision support, this research proposed and tested the Integrated Hybrid Analytics Framework (IHAF) through the stages of a Design Science methodology and insights from the literature on decision-making in dynamic sociotechnical systems, data analytics, and simulation. Informed by IHAF, an HM was developed using real-time Emergency Department data, time-series forecasting, and discrete-event simulation. The application started with patient questionnaires to support problem definition and to act as a formative evaluation, and was subsequently evaluated using staff interviews. Evaluation of the application found multiple examples where the objectives of people or sub-systems are not aligned, resulting in inefficiencies and other quality problems characteristic of complex adaptive sociotechnical systems. Synthesis of the literature, the formative evaluation, and the final evaluation identified significant themes that can act as antecedents or evaluation criteria for future real-time HM studies in sociotechnical systems, particularly in healthcare. The generic utility of IHAF for supporting future applications in similar domains is emphasised.
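    A minimal sketch of the hybrid pattern described above, under the assumption that a short-horizon forecast of arrivals drives a small discrete-event simulation of a shared treatment resource. The stand-in moving-average forecaster and all parameters are illustrative, not the thesis's models.

```python
# Hybrid sketch: naive arrival forecast feeding a tiny discrete-event
# simulation that predicts near-term mean waiting time.
import random

def forecast_rate(recent_hourly_counts):
    # Stand-in forecaster: mean of a recent window; the thesis uses
    # proper time-series forecasting on real-time ED data.
    return sum(recent_hourly_counts) / len(recent_hourly_counts)

def predict_mean_wait(rate_per_hour, servers, horizon_h=4.0,
                      service_mean_h=0.5, seed=0):
    rng = random.Random(seed)
    t, waits = 0.0, []
    free_at = [0.0] * servers                  # when each server frees up
    while True:
        t += rng.expovariate(rate_per_hour)    # forecast-driven arrival
        if t >= horizon_h:
            break
        s = min(range(servers), key=free_at.__getitem__)
        start = max(t, free_at[s])
        waits.append(start - t)                # time spent queueing
        free_at[s] = start + rng.expovariate(1.0 / service_mean_h)
    return sum(waits) / max(len(waits), 1)     # mean wait, in hours

print(predict_mean_wait(forecast_rate([9, 12, 11, 10]), servers=3))
```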

    Towards Bayesian Model-Based Demography

    This open access book presents a ground-breaking approach to developing micro-foundations for demography and migration studies. It offers a novel methodology for creating empirically grounded agent-based models of international migration – one of the most uncertain population processes and a top-priority policy area. The book discusses in detail the process of building a simulation model of migration based on a population of intelligent, cognitive agents, their networks and institutions, all interacting with one another. The proposed model-based approach integrates behavioural and social theory with formal modelling by embedding the interdisciplinary modelling process within a wider inductive framework based on Bayesian statistical reasoning. Principles of uncertainty quantification are used to devise innovative computer-based simulations and to learn about modelling the simulated individuals and the way they make decisions. The identified knowledge gaps are subsequently filled with information from dedicated laboratory experiments on cognitive aspects of human decision-making under uncertainty. In this way, the models are built iteratively, from the bottom up, filling an important epistemological gap in migration studies and in the social sciences more broadly.
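    One common way to realise this kind of Bayesian grounding of an agent-based model is approximate Bayesian computation; the toy sketch below calibrates a single migration-propensity parameter against an observed migrant count. Everything here is illustrative and far simpler than the book's cognitive agents.

```python
# Toy rejection-ABC calibration of an agent-based migration model.
import random

def abm_migrants(propensity, n_agents=1000, rng=random):
    # Minimal agent rule: each agent migrates independently with the given
    # propensity; real models add networks, institutions and cognition.
    return sum(rng.random() < propensity for _ in range(n_agents))

def abc_posterior(observed, n_draws=10000, tol=10, seed=1):
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(0.0, 0.2)          # prior on the propensity
        if abs(abm_migrants(theta, rng=rng) - observed) <= tol:
            accepted.append(theta)             # simulation matched the data
    return accepted                            # approximate posterior sample

samples = abc_posterior(observed=50)
print(sum(samples) / max(len(samples), 1))     # posterior mean propensity
```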

    Simulation of a Multiprocessor Computer System

    The introduction of computers and software engineering in telephone switching systems has dictated the need for powerful design aids for such complex systems. Among these design aids, simulators - real-time environment simulators and flat-level simulators - have been found particularly useful in the design and evaluation of stored program controlled switching systems. However, both types of simulators suffer from certain disadvantages. An alternative methodology for simulating stored program controlled switching systems is proposed in this research, based on the development of a process-based, multilevel, hierarchically structured software simulator. This methodology eliminates the disadvantages of environment and flat-level simulators. It enables the system to be modelled in a one-to-one transformation that retains the sub-system interfaces, making it easier to see the resemblance between the model and the modelled system and to incorporate design modifications and/or additions in the simulator. The methodology has been applied in building a simulation package for the System X family of exchanges. The Processor Utility Sub-system used to control the exchanges is first simulated, verified and validated. The application sub-system models are then added one level higher, resulting in an open-ended simulator with sub-system models at different levels of detail, capable of simulating any member of the System X family of exchanges. The viability of the methodology is demonstrated by conducting experiments to tune the real-time operating system and by simulating a particular exchange - the Digital Main Network Switching Centre - in order to determine its performance characteristics. (The General Electric Company Ltd, GEC Hirst Research Centre, Wembley)
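    A compact sketch of the process-based style the thesis builds on, assuming each (sub)system model is a coroutine that yields the delay until its next action, so sub-systems at different levels of detail can coexist in one event list. The subsystem names are illustrative, not the System X models.

```python
# Process-based simulation skeleton: each process yields its next delay.
import heapq
import itertools

def processor_utility(cycle=1.0):
    while True:
        # platform level: dispatch work, handle the operating-system tick
        yield cycle

def call_handling(calls_per_unit=0.2):
    while True:
        # application sub-system, modelled one level higher
        yield 1.0 / calls_per_unit

def run(processes, until=10.0):
    seq = itertools.count()                    # tie-breaker for equal times
    agenda = [(0.0, next(seq), p) for p in processes]
    heapq.heapify(agenda)
    while agenda and agenda[0][0] <= until:
        t, _, p = heapq.heappop(agenda)
        delay = next(p)                        # resume the process at time t
        heapq.heappush(agenda, (t + delay, next(seq), p))

run([processor_utility(), call_handling()])
```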

    Improving reproducibility and reuse of modelling results in the life sciences

    Research results are complex and include a variety of heterogeneous data. This entails major computational challenges: (i) managing simulation studies, (ii) ensuring model exchangeability, stability and validity, and (iii) fostering communication between partners. I describe techniques to improve the reproducibility and reuse of modelling results. First, I introduce a method to characterise differences between computational models. Second, I present approaches to obtain shareable and reproducible research results. Altogether, my methods and tools foster the exchange and reuse of modelling results. My implementations have been successfully integrated into international applications and promote the sharing of scientific results.
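    As a toy illustration of the first contribution, the sketch below characterises the differences between two versions of a model reduced to flat parameter dictionaries; the thesis works on full model documents rather than dicts, and all names here are hypothetical.

```python
# Minimal model diff: report added, removed and changed entries.
def model_diff(old, new):
    return {
        "added":   {k: new[k] for k in new.keys() - old.keys()},
        "removed": {k: old[k] for k in old.keys() - new.keys()},
        "changed": {k: (old[k], new[k])
                    for k in old.keys() & new.keys() if old[k] != new[k]},
    }

# Hypothetical example: two iterations of a model's kinetic parameters.
v1 = {"k1": 0.1, "k2": 2.0}
v2 = {"k1": 0.1, "k2": 2.5, "k3": 7.0}
print(model_diff(v1, v2))
# -> {'added': {'k3': 7.0}, 'removed': {}, 'changed': {'k2': (2.0, 2.5)}}
```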