
    On improving the performance of optimistic distributed simulations

    This report investigates means of improving the performance of optimistic distributed simulations without affecting simulation accuracy. We argue that existing clustering algorithms are not adequate for application in distributed simulations, and we outline the characteristics of an ideal algorithm for this field. The report is structured as follows. We start by introducing the area of distributed simulation. After comparing the dominant protocols used in distributed simulation, we elaborate on current approaches to improving simulation performance: computation-efficient techniques, exploitation of the hardware configuration of processors, optimizations derived from the simulation scenario, and others. We introduce the core characteristics of clustering approaches and argue that they cannot be applied to real-life distributed simulation problems. We present a typical distributed simulation setting and explain why existing clustering approaches are not expected to improve the performance of a distributed simulation. We then introduce a prototype distributed simulation platform developed in the scope of this research, focused on emergency response and specifically building evacuation. We conclude by outlining our current work on this issue and the next steps that could be taken in this field.
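
The abstract does not name the optimistic protocol it targets; the dominant one in the literature is Time Warp, where each logical process executes events speculatively and rolls back when a straggler (an event with an earlier timestamp) arrives. Below is a minimal sketch of that mechanism, assuming a dict-valued state and hypothetical names (`OptimisticLP`, `Event`); anti-messages and GVT-based fossil collection are omitted for brevity.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    timestamp: float
    payload: object = field(compare=False, default=None)

class OptimisticLP:
    """Minimal Time Warp-style logical process: execute events
    optimistically, save state before each one, roll back on a straggler."""

    def __init__(self, state):
        self.state = state              # dict-valued simulation state
        self.lvt = 0.0                  # local virtual time
        self.processed = []             # (event, pre-execution state) history
        self.pending = []               # rolled-back events awaiting re-execution

    def execute(self, event, handler):
        if event.timestamp < self.lvt:  # straggler: undo later events first
            self.rollback(event.timestamp)
        self.processed.append((event, dict(self.state)))  # save a state copy
        self.lvt = event.timestamp
        handler(self.state, event)      # handler mutates the state in place

    def rollback(self, to_time):
        # Undo every optimistically executed event later than `to_time`,
        # restoring the state saved before it and re-enqueueing it.
        while self.processed and self.processed[-1][0].timestamp > to_time:
            event, saved = self.processed.pop()
            self.state = saved
            self.pending.append(event)
        self.lvt = to_time
```

Saving a full state copy per event, as above, is the simplest correct policy; real optimistic simulators trade it off against incremental or periodic state saving, which is one of the performance levers the report surveys.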

    Development of decision-support concepts and tools for optimisation via simulation: integrating metaheuristics into the DEVS formalism (Développement de concepts et outils d’aide à la décision pour l’optimisation via simulation: Intégration des métaheuristiques au formalisme DEVS)

    In the world we live in, the need for efficiency is growing in various fields, from industry to medicine and environmental monitoring. To meet this need, many optimisation methods known as "metaheuristics" have been created over the last forty years. They are based on probabilistic and random reasoning and allow users to solve problems for which conventional methods cannot produce results in acceptable computing times. Victims of their methods' success, their developers must now answer several open questions: "How can the fitness of solutions be assessed reliably and quickly?", "Which method should be chosen for a specific problem?", "How should the algorithm be parametrised?", "How can the same method be used on several problems without changing its code?".
    To address these questions, we have developed a set of concepts and tools in the context of the modelling and simulation of discrete-event systems with the DEVS formalism. Two objectives are pursued: to allow the temporised and spatialised optimisation of existing DEVS models, and to improve the efficiency of the optimisation process (quality of solutions, computing time). Modelling and simulation are used to propose parameters at the inputs of the problem to be optimised; the problem model in turn generates results used to improve the next proposed solutions. To combine optimisation and simulation, we represent the optimisation methods as models that can easily be interconnected and simulated, focusing on the consistency of the exchanges between the optimisation models and the problem models. Our approach allows the early stopping of useless simulations and thereby reduces computing time. Modelling optimisation methods in the DEVS formalism also makes it possible to choose the optimisation algorithm and its parameters automatically: different algorithms and parameters can be used for the same problem at different steps of the optimisation process. These changes are driven by the collected simulation results and lead to a self-adaptation of the optimisation to the visible and/or hidden features of the studied problem.
    Our model architecture has been tested on three different problems: parametric optimisation of mathematical functions, spatialised optimisation of a wireless sensor network deployment, and temporised optimisation of a medical treatment. The genericity of our concepts and the modularity of our models underline the usability of the proposed tool. Concerning performance, the interruption of useless simulations and the dynamic choice of optimiser yield higher-quality solutions in less time.
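
The coupling described above, where the optimiser proposes input parameters, the simulated problem model returns results, and useless simulations are stopped early, can be sketched as follows. The function names, the stopping rule, and the toy problem are hypothetical illustrations, not the thesis's DEVS models.

```python
import random

def simulate(params, best_so_far, step_cost, horizon):
    """Hypothetical problem-model simulation: accumulates a cost step by
    step and aborts as soon as it can no longer beat the best solution
    found so far (the 'useless simulation' early stop)."""
    cost = 0.0
    for _ in range(horizon):
        cost += step_cost(params)
        if cost >= best_so_far:          # cannot improve: stop early
            return None
    return cost

def optimise(step_cost, propose, horizon=100, budget=1000):
    """Optimisation-via-simulation loop: the metaheuristic proposes
    candidates, the simulation evaluates them (possibly partially)."""
    best_params, best_cost = None, float("inf")
    for _ in range(budget):
        candidate = propose(best_params)           # metaheuristic move
        cost = simulate(candidate, best_cost, step_cost, horizon)
        if cost is not None and cost < best_cost:  # completed run, improved
            best_params, best_cost = candidate, cost
    return best_params, best_cost

# Example: minimise a sum of squares with randomised local search.
if __name__ == "__main__":
    step = lambda p: sum(x * x for x in p) / 100
    prop = lambda best: [b + random.uniform(-1, 1)
                         for b in (best or [0.0, 0.0, 0.0])]
    print(optimise(step, prop))
```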

    Monitoring, Modeling, and Hybrid Simulation: An Integrated Bayesian-based Approach to High-fidelity Fragility Analysis

    Fragility functions are one of the key technical ingredients in seismic risk assessment. The derivation of fragility functions has been extensively studied in the past; however, large uncertainties still exist, mainly due to limited collaboration between the interdependent components involved in the course of fragility estimation. This research aims to develop a systematic Bayesian-based framework to estimate high-fidelity fragility functions by integrating monitoring, modeling, and hybrid simulation, with the final goal of improving the accuracy of seismic risk assessment to support both pre- and post-disaster decision-making. In particular, this research addresses the following five aspects of the problem: (1) monitoring with wireless smart sensor networks to facilitate efficient and accurate pre- and post-disaster data collection, (2) new modeling techniques, including innovative system identification strategies and model updating, to enable accurate structural modeling, (3) hybrid simulation as an advanced numerical-experimental simulation tool to generate highly realistic and accurate response data for structures subject to earthquakes, (4) Bayesian updating as a systematic way of incorporating hybrid simulation data to generate composite fragility functions with higher fidelity, and (5) the implementation of an integrated fragility analysis approach as part of a seismic risk assessment framework. This research not only delivers an extensible and scalable framework for high-fidelity fragility analysis and reliable seismic risk assessment, but also provides advances in wireless smart sensor networks, system identification, and pseudo-dynamic testing in civil engineering applications. Financial support for this research was provided in part by the National Science Foundation under NSF Grants No. CMS-060043, CMMI-0724172, CMMI-0928886, and CNS-1035573.
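
The abstract gives no formulas; a common choice, which the sketch below assumes, is a lognormal fragility function whose median capacity is refined by a grid-based Bayesian update as new response data (e.g., from hybrid simulation runs) arrive. All names and numbers below are illustrative, not the thesis's model.

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility: P(damage state reached | intensity measure im),
    with median capacity `theta` and log-standard deviation `beta`."""
    return 0.5 * (1 + math.erf(math.log(im / theta) / (beta * math.sqrt(2))))

def bayes_update(prior, likelihood):
    """Grid Bayesian update: posterior is proportional to prior times
    likelihood, renormalised to sum to one."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)
    return [p / z for p in post]

# Example: update a discrete prior on the median capacity `theta` using
# binary outcomes (failed / survived) from hypothetical simulation runs.
grid = [0.3, 0.4, 0.5, 0.6, 0.7]              # candidate medians (g)
prior = [1 / len(grid)] * len(grid)           # uniform prior
observations = [(0.45, True), (0.35, False)]  # (im, failed?)
beta = 0.4
for im, failed in observations:
    lik = [fragility(im, th, beta) if failed else 1 - fragility(im, th, beta)
           for th in grid]
    prior = bayes_update(prior, lik)
print(dict(zip(grid, (round(p, 3) for p in prior))))
```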

    Modelling and Design of Resilient Networks under Challenges

    Communication networks, in particular the Internet, face a variety of challenges that can disrupt our daily lives and, in the worst cases, result in the loss of human lives and significant financial costs. We define challenges as external events that trigger faults that eventually result in service failures. Understanding these challenges is therefore essential for improving current networks and for designing Future Internet architectures. This dissertation presents a taxonomy of challenges that can help evaluate design choices for the current and Future Internet. Graph models for analysing critical infrastructures are examined, and a multilevel graph model is developed to study interdependencies between different networks. Furthermore, graph-theoretic heuristic optimisation algorithms are developed. These heuristics add links to increase the resilience of networks in the least costly manner, and they are computationally less expensive than an exhaustive search. The performance of networks under random failures, targeted attacks, and correlated area-based challenges is evaluated with the challenge simulation module we developed. The GpENI Future Internet testbed is used to conduct experiments evaluating the performance of the developed heuristic algorithms.
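
As an illustration of heuristic link addition, the sketch below greedily adds the candidate edge with the best resilience gain per unit cost, using the size of the largest connected component after a fixed node attack as a stand-in metric; the metric, cost function, and example inputs are assumptions, not the dissertation's algorithms.

```python
import networkx as nx

def resilience(G, attacked):
    """Proxy metric: fraction of nodes still in the largest connected
    component after removing the attacked nodes."""
    H = G.copy()
    H.remove_nodes_from(attacked)
    if H.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(H)) / G.number_of_nodes()

def greedy_augment(G, attacked, cost, budget):
    """Add links one at a time, each time picking the candidate edge with
    the best resilience gain per unit cost; far cheaper than exhaustively
    searching over all subsets of candidate edges."""
    added = []
    while budget > 0:
        base = resilience(G, attacked)
        best, best_score = None, 0.0
        for u, v in nx.non_edges(G):
            if cost(u, v) > budget:
                continue
            G.add_edge(u, v)
            score = (resilience(G, attacked) - base) / cost(u, v)
            G.remove_edge(u, v)
            if score > best_score:
                best, best_score = (u, v), score
        if best is None:          # no affordable edge improves resilience
            break
        G.add_edge(*best)
        budget -= cost(*best)
        added.append(best)
    return added

# Example: protect a path graph against removal of its middle node.
G = nx.path_graph(6)
print(greedy_augment(G, attacked=[3], cost=lambda u, v: 1, budget=2))
```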

    Proceedings, MSVSCC 2017

    Proceedings of the 11th Annual Modeling, Simulation & Visualization Student Capstone Conference, held on April 20, 2017 at VMASC in Suffolk, Virginia. 211 pp.

    Towards Simulation and Emulation of Large-Scale Computer Networks

    Developing analytical models that can accurately describe the behavior of Internet-scale networks is difficult, due in part to the heterogeneous structure, immense size, and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet scale. This can be attributed to three factors: (1) current large-scale network simulators are geared towards simulation research rather than network research, (2) the memory required to execute an Internet-scale model is exorbitant, and (3) large-scale network models are difficult to validate. This dissertation tackles each of these problems. First, this work presents a method for automatically enabling real-time interaction, monitoring, and control of large-scale network models. Network researchers need tools that let them focus on creating realistic models and conducting experiments, without increasing the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with those models. Second, this work deals with reducing the memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplication in network models to dramatically reduce the memory required to execute large-scale network experiments. Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real time can run together with real-world distributed applications and services. Real-time network simulation thus not only alleviates the burden of developing separate models for applications, but also increases the confidence level of network simulation, since real systems are included in the network model. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
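
The structural-duplication idea resembles flyweight sharing (hash-consing): structurally identical sub-objects are stored once and referenced everywhere they recur. A minimal sketch with hypothetical names follows; the dissertation's actual mechanism operates on network model structures rather than this toy configuration class.

```python
class RouterConfig:
    """Immutable per-router configuration that many simulated nodes share."""
    def __init__(self, ports, bandwidth, queue_len):
        self.ports, self.bandwidth, self.queue_len = ports, bandwidth, queue_len

_pool = {}

def shared_config(ports, bandwidth, queue_len):
    """Flyweight factory: structurally identical configs are created once
    and shared, so thousands of duplicate routers cost one config object."""
    key = (ports, bandwidth, queue_len)
    if key not in _pool:
        _pool[key] = RouterConfig(*key)
    return _pool[key]

# A large model with massive duplication: 100,000 edge routers share a
# single config object instead of carrying 100,000 identical copies.
routers = [shared_config(ports=24, bandwidth=10_000, queue_len=64)
           for _ in range(100_000)]
assert all(r is routers[0] for r in routers)
print(len(_pool), "unique config(s) for", len(routers), "routers")
```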

    History, evolution and future prospects of the use of simulation techniques in port management: applications in the analysis of port operations, strategy and planning (Historia, evolución y perspectivas de futuro en la utilización de técnicas de simulación en la gestión portuaria: aplicaciones en el análisis de operaciones, estrategia y planificación portuaria)

    Simulation techniques, to the extent that we understand them nowadays, began in the middle of the 20th century: first with the appearance of the computer and the development of the Monte Carlo method, and later with the development of the first special-purpose simulator, GPSS, created by Geoffrey Gordon at IBM, and the publication of the first full text devoted to this subject, The Art of Simulation (K.D. Tocher, 1963). These techniques have evolved in an extraordinary way and are now fully established in many fields of activity. Port facilities have not escaped this trend, especially those dedicated to container traffic. Indeed, the intrinsic characteristics of this economic sector make it an ideal candidate for the implementation of simulation models with very diverse purposes and scopes. However, to the best of our knowledge, there is no scientific work that compiles and analyzes in detail both the history and the evolution of simulation in port environments, helping to classify these models and to determine how they can support the economic analysis of such facilities and the formulation of appropriate business strategies. This is the ultimate goal of this doctoral thesis.

    Workplace values in the Japanese public sector: a constraining factor in the drive for continuous improvement


    Simulating academic entrepreneurship and inter-organisational collaboration in university ecosystems: a hybrid system dynamics agent-based simulation

    Universities are increasingly expected to actively contribute to socio-economic development. Academic entrepreneurship and the evolution of the entrepreneurial university within ecosystems have received increasing attention from both policymakers and academic communities over the last decades. However, most studies of universities' external engagement have focused on individual activities and single universities, thereby neglecting the feedback effects between different activities and the way universities are linked through an overlap of their ecosystems. The result is an incomplete understanding of how universities interact with their ecosystem and of the resulting inter- and intra-organisational dynamics. This research addresses this issue by developing a hybrid system dynamics agent-based model, which captures the feedback structure and the internal decision-making of universities and companies. Both the conceptual and the simulation model are based on a triangulation of the literature, interviews with representatives of Scottish universities, and secondary data for Scottish universities and UK businesses. This research makes several theoretical, methodological, and empirical contributions. From a theoretical perspective, it contributes in two distinct ways to the field of entrepreneurship: by defining university ecosystems in a new way that provides a basis for future research, and by developing a multi-modal simulation model that can be applied and tested in different contexts. The methodological contributions to the field of modelling and simulation in management science include a modelling process for hybrid simulations, new practices for modelling the size of agent populations through different designs of stocks and flows in the system dynamics module of hybrid simulations, and complex events for recognising emergent behaviour. Lastly, this research makes two empirical contributions to the field of entrepreneurship: it shines a light on the dynamics of academic entrepreneurship and on how universities can partially overcome low research prestige to increase academic entrepreneurship. Implications for policy and practice are outlined, and opportunities for future research conclude this thesis.
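
The population-sizing practice mentioned above, where stocks and flows in the system dynamics module drive the size of an agent population, can be illustrated as follows; the names, rates, and agent behaviour are hypothetical, not taken from the thesis.

```python
import random

class Academic:
    """Agent: an academic who may found a spin-out each year."""
    def __init__(self, founding_prob):
        self.founding_prob = founding_prob

def step(stock, hires, departures, agents, dt=1.0):
    """One hybrid step: integrate the SD stock (academics employed), then
    resize the agent population to match it, then let the agents act."""
    stock += (hires - departures) * dt           # stock-and-flow update
    target = round(stock)
    while len(agents) < target:                  # inflow creates agents
        agents.append(Academic(founding_prob=0.02))
    while len(agents) > target:                  # outflow removes agents
        agents.pop(random.randrange(len(agents)))
    spinouts = sum(random.random() < a.founding_prob for a in agents)
    return stock, spinouts

# Example run: net growth of four academics per year for five years.
stock, agents = 100.0, [Academic(0.02) for _ in range(100)]
for year in range(5):
    stock, spinouts = step(stock, hires=12, departures=8, agents=agents)
    print(f"year {year}: {len(agents)} academics, {spinouts} spin-out(s)")
```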