Developing a model of evacuation after an earthquake in Lebanon
This article describes the development of an agent-based model (AMEL,
Agent-based Model for Earthquake evacuation in Lebanon) that aims to simulate
the movement of pedestrians shortly after an earthquake. The GAMA platform was
chosen to implement the model. AMEL is applied to a real case study, a district
of the city of Beirut, Lebanon, which could potentially be struck by an M7
earthquake. The objective of the model is to reproduce real-life mobility
behaviours gathered through a survey in Beirut and to test different future
scenarios, which may help the local authorities target information campaigns.
Comment: 8 pages, 11 figures, ISCRAM Vietnam Conference, November 201
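The shelter-seeking dynamics such a model simulates can be illustrated with a minimal toy (a Python sketch of generic evacuating agents, not the actual GAMA/AMEL implementation; the shelter locations, walking speed, and nearest-shelter rule are assumptions for illustration):

```python
import math

def step_agents(agents, shelters, speed=1.4):
    """Move each pedestrian one time step toward the nearest shelter.

    agents: list of (x, y) positions in metres.
    shelters: list of (x, y) shelter locations.
    speed: distance walked per time step (~1.4 m/s average walking speed).
    Returns the updated agent positions.
    """
    updated = []
    for (x, y) in agents:
        # Pick the nearest shelter (a stand-in for survey-derived behaviour rules).
        sx, sy = min(shelters, key=lambda s: math.dist((x, y), s))
        d = math.dist((x, y), (sx, sy))
        if d <= speed:
            updated.append((sx, sy))  # agent has reached safety
        else:
            # Move `speed` metres along the straight line to the shelter.
            updated.append((x + speed * (sx - x) / d, y + speed * (sy - y) / d))
    return updated

agents = [(10.0, 0.0), (0.0, 3.0)]
shelters = [(0.0, 0.0)]
for _ in range(10):
    agents = step_agents(agents, shelters)
print(agents)  # both agents converge on the shelter
```

A real model would replace the nearest-shelter rule with the surveyed behaviours (staying put, regrouping with family, etc.) and a street network instead of straight-line motion.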
A Real Time Web Based Electronic Triage, Resource Allocation and Hospital Dispatch System for Emergency Response
Disasters are characterized by large numbers of victims and resource demands that overwhelm the available resources. Disaster response involves entities such as Incident Commanders, dispatch centers, emergency operations centers, area command, and hospitals. An effective emergency response system should facilitate coordination between these entities. Victim triage, emergency resource allocation, and victim dispatch to hospitals form an important part of an emergency response system. In the present research effort, an emergency response system with the aforementioned components is developed.
Triage is the process of prioritizing mass casualty victims based on the severity of their injuries. The system presented in this thesis is a low-cost victim triage system with RFID tags that aggregates all victim information within a database. It allows first responders' movements to be tracked using GPS. A web-based real-time resource allocation tool that can assist Incident Commanders in resource allocation and transportation for multiple simultaneous incidents has been developed. This tool ensures that high-priority resources are received at emergency sites in the least possible time. The web-based tool also computes the patient dispatch schedule from each disaster site to each hospital; patients are allocated to the nearest hospitals with available medical facilities. It can further assist resource managers in emergency resource planning by computing the time taken to receive required resources from the nearest depots using Google Maps. These web-based tools complement emergency response systems by providing decision-making capabilities.
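The dispatch logic described above can be sketched as a simple greedy heuristic (an illustrative Python toy, not the thesis's actual tool; the priority scheme, hospital capacities, and travel times are invented assumptions):

```python
def dispatch(victims, hospitals, travel_time):
    """Greedy victim-to-hospital dispatch sketch.

    victims: list of (victim_id, priority) — lower number = more urgent,
             mirroring triage categories (e.g. 1 = immediate, 3 = delayed).
    hospitals: dict hospital_id -> remaining bed capacity.
    travel_time: dict (victim_id, hospital_id) -> minutes.
    Returns a list of (victim_id, hospital_id) assignments.
    """
    schedule = []
    # Process the most urgent victims first.
    for vid, _priority in sorted(victims, key=lambda v: v[1]):
        # Among hospitals with free capacity, choose the closest one.
        open_hospitals = [h for h, cap in hospitals.items() if cap > 0]
        if not open_hospitals:
            break  # every hospital is full
        best = min(open_hospitals, key=lambda h: travel_time[(vid, h)])
        hospitals[best] -= 1
        schedule.append((vid, best))
    return schedule

victims = [("v1", 3), ("v2", 1)]
hospitals = {"H1": 1, "H2": 1}
travel = {("v1", "H1"): 5, ("v1", "H2"): 9,
          ("v2", "H1"): 4, ("v2", "H2"): 12}
print(dispatch(victims, hospitals, travel))  # v2 (immediate) gets the closer H1
```

A production tool would additionally match required medical facilities per victim and re-run the allocation as live GPS and RFID updates arrive.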
An information assistant system for the prevention of tunnel vision in crisis management
In the crisis management environment, tunnel vision is a set of biases in decision makers' cognitive processes that often leads to an incorrect understanding of the real crisis situation, biased perception of information, and improper decisions. The tunnel vision phenomenon is a consequence both of the challenges of the task and of the natural limitations of human cognition. An information assistant system is proposed with the purpose of preventing tunnel vision. The system serves as a platform for monitoring the ongoing crisis event: all information goes through the system before it arrives at the user. The system enhances data quality, reduces data quantity, and presents the crisis information in a manner that prevents or relieves the user's cognitive overload. While working with such a system, the users (crisis managers) are expected to be more likely to stay aware of the actual situation, remain open-minded to possibilities, and make proper decisions.
Online optimization of casualty processing in major incident response: An experimental analysis
When designing an optimization model for use in mass casualty incident (MCI) response, the dynamic and uncertain nature of the problem environment poses a significant challenge. Many key problem parameters, such as the number of casualties to be processed, will typically change as the response operation progresses. Other parameters, such as the time required to complete key response tasks, must be estimated and are therefore prone to errors. In this work, we extend a multi-objective combinatorial optimization model for MCI response to improve performance in dynamic and uncertain environments. The model is developed for use in real time, with continuous communication between the optimization model and the problem environment. A simulation of this problem environment is described, allowing for a series of computational experiments that evaluate how model utility is influenced by a range of key dynamic or uncertain problem and model characteristics. It is demonstrated that the move to an online system mitigates the impact of poor communication speed, while errors in the estimation of task-duration parameters are shown to significantly reduce model utility.
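The online setup described here — a solver that re-plans continuously as the environment sends updated parameters — can be sketched as follows (a greedy single-objective stand-in for the paper's multi-objective combinatorial model; task names, durations, and responders are invented for illustration):

```python
import heapq

def plan(tasks, responders):
    """Re-solve the assignment from scratch: give each task, longest first,
    to the responder who becomes free soonest (a greedy stand-in for the
    paper's multi-objective combinatorial optimization model)."""
    free_at = [(0.0, r) for r in responders]  # (time responder is free, id)
    heapq.heapify(free_at)
    finish = {}
    for task, duration in sorted(tasks.items(), key=lambda t: -t[1]):
        t, r = heapq.heappop(free_at)
        finish[task] = (r, t + duration)
        heapq.heappush(free_at, (t + duration, r))
    return finish

# Online loop: each tick brings revised parameter estimates from the
# (simulated) incident ground, and the schedule is recomputed.
tasks = {"casualty_1": 20.0, "casualty_2": 15.0}
for update in [{"casualty_3": 30.0},   # a new casualty is reported
               {"casualty_1": 25.0}]:  # a duration estimate is revised
    tasks.update(update)
    schedule = plan(tasks, ["medic_A", "medic_B"])
print(schedule)
```

The point of the online architecture is precisely this loop: stale parameter values are replaced before they can distort the plan, which is why communication speed matters less than estimation error.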
Geometric, Semantic, and System-Level Scene Understanding for Improved Construction and Operation of the Built Environment
Recent advances in robotics and enabling fields such as computer vision, deep learning, and low-latency data passing offer significant potential for developing efficient and low-cost solutions for improved construction and operation of the built environment. Examples of such potential solutions include the introduction of automation in environment monitoring, infrastructure inspections, asset management, and building performance analyses. In an effort to advance the fundamental computational building blocks for such applications, this dissertation explored three categories of scene understanding capabilities: 1) Localization and mapping for geometric scene understanding that enables a mobile agent (e.g., robot) to locate itself in an environment, map the geometry of the environment, and navigate through it; 2) Object recognition for semantic scene understanding that allows for automatic asset information extraction for asset tracking and resource management; 3) Distributed coupling analysis for system-level scene understanding that allows for discovery of interdependencies between different built-environment processes for system-level performance analyses and response-planning.
First, this dissertation advanced Simultaneous Localization and Mapping (SLAM) techniques toward more convenient and lower-cost locating capabilities than previous work. To provide a versatile Real-Time Location System (RTLS), an occupancy grid mapping enhanced visual SLAM (vSLAM) was developed to support path planning and continuous navigation, which cannot be implemented directly on vSLAM's original feature map. The system's localization accuracy was experimentally evaluated with a set of visual landmarks. The achieved marker position measurement accuracy ranges from 0.039 m to 0.186 m, demonstrating the method's feasibility and applicability in providing real-time localization for a wide range of applications. In addition, a Self-Adaptive Feature Transform (SAFT) was proposed to improve such an RTLS's robustness in challenging environments. As an example implementation, the SAFT descriptor was realized with a learning-based descriptor and integrated into a vSLAM system for experimentation. The evaluation results on two public datasets proved the feasibility and effectiveness of SAFT in improving the matching performance of learning-based descriptors for locating applications.
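The occupancy-grid side of such a mapping system typically follows the standard log-odds Bayesian update; a minimal per-cell sketch (the inverse sensor model probabilities 0.7/0.3 are assumptions for illustration, not values from the dissertation):

```python
import math

# Log-odds occupancy update — the standard Bayesian filter behind
# occupancy grid mapping.
L_OCC = math.log(0.7 / 0.3)   # evidence increment when the sensor says "occupied"
L_FREE = math.log(0.3 / 0.7)  # evidence increment when the sensor says "free"

def update_cell(log_odds, hit):
    """Fold one range-sensor observation into a cell's log-odds value."""
    return log_odds + (L_OCC if hit else L_FREE)

def probability(log_odds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

cell = 0.0                       # prior: p = 0.5 (unknown)
for hit in [True, True, False]:  # two "occupied" readings, one "free"
    cell = update_cell(cell, hit)
print(round(probability(cell), 2))  # → 0.7
```

Working in log-odds keeps the update a simple addition per observation, which is what makes grid maps cheap enough to maintain in real time alongside vSLAM.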
Second, this dissertation explored vision-based 1D barcode marker extraction for automated object recognition and asset tracking that is more convenient and efficient than the traditional method of using barcode or asset scanners. As an example application in inventory management, a 1D barcode extraction framework was designed to extract 1D barcodes from video scans of a built environment. The performance of the framework was evaluated with video scan data collected from an active logistics warehouse near Detroit Metropolitan Airport (DTW), demonstrating its applicability in automating inventory tracking and management applications.
Finally, this dissertation explored distributed coupling analysis for understanding interdependencies between processes affecting the built environment and its occupants, allowing for more accurate performance and response analyses compared with previous research. In this research, a Lightweight Communications and Marshalling (LCM)-based distributed coupling analysis framework and a message wrapper were designed. The proposed framework and message wrapper were tested with analysis models from wind engineering and structural engineering, where they demonstrated the ability to link analysis models from different domains and reveal key interdependencies between the involved built-environment processes.
PhD, Civil Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/155042/1/lichaox_1.pd
Distributed Simulation of Interdependencies in Community Resilience
When a disruption such as a severe natural event occurs, the interdependencies between the infrastructure systems of society can lead to cascading events that adversely affect community resilience. Resilience is the ability of a community to withstand, adapt to, and recover from a disruption, typically measured in terms of loss of life, injuries, and economic cost. Studying the interactions between infrastructure systems is complicated by the fact that each system is rooted in a specific field and thus requires crossing disciplinary barriers. To overcome this research challenge, this dissertation employs distributed computational simulation to model and investigate the interdependencies that arise during severe disasters and the post-disaster recovery process. It focuses in particular on multi-scale interdependencies and their time-dependent effects on community resilience.
A distributed simulation framework that links each discipline specific simulator using a publish-subscribe data transmission pattern is proposed. The framework’s capabilities are demonstrated through a case study of multistory buildings that suffer wind-induced progressive damage. Data transmission is achieved using the Lightweight Communications and Marshalling (LCM) libraries. Building upon the LCM platform, a group of discipline specific computational models with disparate temporal and spatial scales are linked together to investigate the time-dependent interdependencies that arise between water, gas, and electrical power systems during a series of seismic events and the corresponding recovery processes. The results show that ignoring interdependencies can adversely affect resilience assessments and adopting time-varying recovery strategies can lead to better resilience performance.
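The publish-subscribe pattern underpinning the framework can be illustrated with a minimal in-process bus (a Python sketch of the pattern only — this is not the LCM API, and the channel name and messages are invented):

```python
from collections import defaultdict

class Bus:
    """Minimal in-process publish-subscribe bus, sketching the data
    transmission pattern the framework realizes over LCM."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, channel, handler):
        """Register a handler to be called for every message on `channel`."""
        self.subscribers[channel].append(handler)

    def publish(self, channel, message):
        # Deliver the message to every simulator listening on this channel.
        for handler in self.subscribers[channel]:
            handler(message)

bus = Bus()
received = []
# Lifeline simulators react to damage states published by the seismic
# simulator; publisher and subscribers never reference each other directly.
bus.subscribe("seismic/damage", lambda msg: received.append(("power", msg)))
bus.subscribe("seismic/damage", lambda msg: received.append(("water", msg)))
bus.publish("seismic/damage", {"substation_3": "failed"})
print(received)
```

This decoupling is what lets discipline-specific simulators with disparate temporal and spatial scales be linked without any one of them knowing the others' internals.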
An agent-based computational model simulating benefit fraud behavior in the wake of a disaster is used to demonstrate that distributed simulation frameworks can take into account broader socio-technical interactions in resilience research. The study considered not only the effect of micro-level disaster-caused demands but also that of meso-level social factors on criminal tendencies. The proposed model captures the key characteristics of post-disaster benefit fraud in detail, including the dynamic nature of the criminal process. The results of parametric sensitivity analyses can be used to achieve a meaningful balance between the loss to fraudulent payments and the speed of distributing aid, improving the overall resilience performance of communities.
To provide a scalable, versatile, and user-friendly solution for natural hazard simulations, a new distributed computing tool called Simple Run-Time Infrastructure (SRTI) is employed. The high-level structure, data structure, and fundamental components of SRTI are comprehensively described, and its applications in natural hazard simulations are presented. The performance of an initial version of the SRTI is compared with that of LCM. A cross-language simulation of the time-dependent resilience analysis of an electric power system is conducted to show the scalability and flexibility of the improved version of the SRTI, which reduces a user's effort in composing a complex distributed simulation and better handles time management. Lastly, the choice between different versions of the SRTI and potential features to develop in the future are discussed.
PhD, Civil Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/155169/1/sylin_1.pd
INTEROPERABILITY FOR MODELING AND SIMULATION IN MARITIME EXTENDED FRAMEWORK
This thesis reports on the most relevant research performed during the years of the Ph.D. at Genova University and within the Simulation Team. The research has been performed according to well-known, recognized M&S standards. The studies on interoperable simulation cover all the environments of the Extended Maritime Framework, namely Sea Surface, Underwater, Air, Coast & Land, Space, and Cyber Space. The applications cover both the civil and defence domains. The aim is to demonstrate the potential of M&S applications for the Extended Maritime Framework, applied to innovative unmanned vehicles as well as to traditional assets, human personnel included. A variety of techniques and methodologies have been fruitfully applied in this research, ranging from interoperable simulation, discrete event simulation, stochastic simulation, artificial intelligence, and decision support systems to human behaviour modelling.
Disaster management and its economic implications
This thesis intends to demonstrate current research directions in the field of
disaster management in the Operational Research literature. Disaster management in
this context comprises the management of natural disasters, such as geophysical and
hydro-meteorological ones, and technological disasters, such as industrial accidents, transportation accidents, and
miscellaneous accidents, as well as the management of the different terrorism forms, general
terrorism and bioterrorism. As disasters are occurring more and more frequently
and the accumulated losses from these events are rising, there is a strong need
for the development, implementation, and economic evaluation of strategies to counter these
disasters.
In the first part of the thesis, a general overview of the literature is given, including a focus on
simulation, disaster management in hospitals, and the role of insurances in the disaster
management process. The second part encompasses the taxonomy which focuses on models
and outcomes presented in the literature. As a result of the review of the literature, appropriate
categories for the disaster management taxonomy are derived. On the one hand, an overview
of general model features, i.e., the level of disaster management, model type and methods of
application is given. On the other hand, the type of intervention used and the practicability for
different disaster types are discussed. Ninety papers, illustrative of the main research
directions of the last 25 years, were selected for deeper investigation and classified
according to the main criteria analyzed in the articles.
The main focus of the taxonomy lies on the economic analysis, which encompasses
effectiveness-related, resource-related, and cost-related parameters and shows the type of
economic analysis used in the literature. We analyze whether economic analyses, i.e.,
cost-utility, cost-effectiveness, and cost-benefit analysis, are used to investigate
different interventions and what type of analysis has been chosen by the authors.
Policy implications and results show that considerable improvements can be achieved for
different disastrous events and in different situations. Limited data availability constrains the
outcomes of the models and their applicability to real-world situations. In general,
cooperation and coordination of the entities involved are crucial to guarantee the timely and efficient assignment of scarce resources. Furthermore, different authors confirm that a
combination of various measures often achieves a better outcome than if the tools are used
in isolation.
The taxonomy has underlined that although there exists a vast disaster management literature
dealing with various problems related to mitigation, preparedness, response and recovery
from disasters, there are only a few authors evaluating the actions taken through economic
analyses such as cost-utility, cost-effectiveness, or cost-benefit analysis.
In the future, to be able to evaluate interventions, or to identify the most effective
intervention among several, it will be crucial to rely more strongly on the aforementioned
economic analyses.