
    Performance Measures to Assess Resiliency and Efficiency of Transit Systems

    Transit agencies are interested in assessing the short-, mid-, and long-term performance of infrastructure with the objective of enhancing resiliency and efficiency. This report addresses three distinct aspects of New Jersey’s transit system: 1) resiliency of bridge infrastructure, 2) resiliency of public transit systems, and 3) efficiency of transit systems, with an emphasis on paratransit service. The project proposes a conceptual framework to assess the performance and resiliency of bridge structures in a transit network before and after disasters, utilizing structural health monitoring (SHM), finite element (FE) modeling, and remote sensing with Interferometric Synthetic Aperture Radar (InSAR). The public transit systems in NY/NJ were analyzed based on their vulnerability, resiliency, and efficiency in recovery following a major natural disaster.

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    This paper presents the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was to train students attending Architecture and Engineering courses in geospatial data acquisition and processing, in order to start up a team of "volunteer mappers". The project aims to document the environmental and built heritage subject to disaster, and to improve the capabilities of the actors involved in geospatial data collection, integration, and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered on the World Heritage List since 1997 and affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    "Last-Mile" preparation for a potential disaster

    Extreme natural events, such as tsunamis or earthquakes, regularly lead to catastrophes with dramatic consequences. In recent years natural disasters have caused hundreds of thousands of deaths, destruction of infrastructure, disruption of economic activity, and billions of dollars in property losses, revealing considerable deficits that hinder their effective management: stakeholders, decision-makers, and affected persons need systematic risk identification and evaluation, a way to assess countermeasures, awareness raising, and decision support systems to be employed before, during, and after crisis situations. The overall goal of this study is the interdisciplinary integration of various scientific disciplines to contribute to a tsunami early warning information system. In contrast to most studies, our focus is on high-end geometric and thematic analysis to meet the requirements of small-scale, heterogeneous, and complex coastal urban systems. Data, methods, and results from engineering, remote sensing, and the social sciences are interlinked to provide comprehensive information for disaster risk assessment, management, and reduction. In detail, we combine inundation modeling, urban morphology analysis, population assessment, socio-economic analysis of the population, and evacuation modeling. The interdisciplinary results lead to recommendations for mitigation strategies in the fields of spatial planning and coping capacity.

    Enhancing OpenStreetMap for the Assessment of Critical Road Infrastructure in a Disaster Context

    The frequency of natural disasters is increasing worldwide, which can cause immense damage to critical road infrastructure and its functionality. It is therefore crucial to assess the functionality of critical road infrastructure before, during, and after a disaster. This requires global road data that is usable for route planning. OpenStreetMap (OSM) provides global road network data that is free and openly accessible. However, using OSM road data for routing applications is often challenging. The overarching goal of this work is to develop a generic, multi-scale concept for analyzing critical road infrastructure in the context of natural hazards using OSM data. To this end, two consecutive research objectives are defined: (i) improving the routability of OSM data and (ii) assessing critical road infrastructure in the context of natural hazards. The thesis is accordingly structured in two main parts, one per research objective. In the first part, the usability of OSM data for routing applications is improved. First, the quality of the OSM road network is analyzed in detail. Two major challenges for the applicability of OSM data to route planning are identified: missing speed information and errors in the road classification. To address the first challenge, a fuzzy framework for speed estimation (Fuzzy-FSE) is developed, which employs fuzzy control to estimate average speeds based on the parameters road class, road slope, road surface, and road length.
    The Fuzzy-FSE consists of two parts: a rule and knowledge base, which decides on the membership functions for the output parameter speed, and multiple fuzzy rule systems, which calculate the resulting average speed. The results show that the Fuzzy-FSE outperforms existing methods even when using OSM data exclusively. The challenge of erroneous road classification is addressed by developing a novel approach for detecting classification errors in OSM, which searches both for disconnected network parts and for gaps in the road network. Various parameters are combined in a scoring system to obtain an error probability, on the basis of which a human user can review and correct these errors. The results indicate, on the one hand, that more classification errors are found at gaps than at disconnected network parts; on the other hand, they show that the developed scoring system enables a user-guided search for gaps that quickly uncovers classification errors. The first part of this work thus yields an enhanced OSM dataset with improved routability. In the second part, the enhanced OSM data is used to assess critical road infrastructure in a disaster context. For this purpose, the second part of the generic, multi-scale concept is developed, consisting of several interconnected modules. One module implements two accessibility indices that highlight different aspects of accessibility in the road network. In another module, a basic travel demand model is developed that estimates daily intercity traffic exclusively on the basis of OSM data.
    A third module uses the modules described above to estimate various types of natural disaster impacts on the road network. Finally, a fourth module analyzes the vulnerability of the road network to further damage during long-duration disasters. The generic concept with all its modules is applied exemplarily in two different regions for two wildfire scenarios. The case-study results show that the concept is a valuable, flexible, and globally applicable tool for regional planners and disaster management that allows country- or region-specific adaptations while requiring little data.
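    The abstract's Fuzzy-FSE combines a rule and knowledge base with fuzzy rule systems over road class, slope, surface, and length. A minimal sketch of that mechanism (membership functions, rule firing, weighted-average defuzzification) might look as follows; all breakpoints, speeds, and rules below are invented for illustration and are not the thesis's actual knowledge base, which also uses road surface and length.

    ```python
    def triangular(x, a, b, c):
        """Triangular membership function: 0 outside [a, c], 1 at the peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def estimate_speed_kmh(road_class, slope_pct):
        """Estimate an average speed from OSM-style attributes (illustrative)."""
        # Hypothetical base speeds per road class (km/h).
        base = {"primary": 80.0, "secondary": 60.0, "track": 30.0}[road_class]
        # Degrees of membership in the linguistic slope terms 'flat' and 'steep'.
        flat = triangular(abs(slope_pct), -1.0, 0.0, 8.0)
        steep = triangular(abs(slope_pct), 4.0, 12.0, 100.0)
        # Rules: IF flat THEN full base speed; IF steep THEN 60 % of base speed.
        degrees = [flat, steep]
        outputs = [base, 0.6 * base]
        total = sum(degrees)
        if total == 0.0:  # outside all membership supports: fall back to base
            return base
        # Weighted-average (centroid-style) defuzzification of the fired rules.
        return sum(d * o for d, o in zip(degrees, outputs)) / total
    ```

    For a flat primary road only the 'flat' rule fires and the full base speed is returned; intermediate slopes blend the two rule outputs in proportion to their membership degrees.
    
    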

    Critical Infrastructure Protection Metrics and Tools Papers and Presentations

    Contents: Dr. Hilda Blanco: Prioritizing Assets in Critical Infrastructure Systems; Christine Poptanich: Strategic Risk Analysis; Geoffrey S. French/Jin Kim: Threat-Based Approach to Risk Case Study: Strategic Homeland Infrastructure Risk Assessment (SHIRA); William L. McGill: Techniques for Adversary Threat Probability Assessment; Michael R. Powers: The Mathematics of Terrorism Risk; Stefan Pickl: SOA Approach to the IT-based Protection of CIP; Richard John: Probabilistic Project Management for a Terrorist Planning a Dirty Bomb Attack on a Major US Port; LCDR Brady Downs: Maritime Security Risk Analysis Model (MSRAM); Chel Stromgren: Terrorism Risk Assessment and Management (TRAM); Steve Lieberman: Convergence of CIP and COOP in Banking and Finance; Harry Mayer: Assessing the Healthcare and Public Health Sector with Model Based Risk Analysis; Robert Powell: How Much and On What? Defending and Deterring Strategic Attackers; Ted G. Lewis: Why Do Networks Cascade?

    Weaving Equity into Infrastructure Resilience Research and Practice: A Decadal Review and Future Directions

    After about a decade of research in this domain, what is missing is a systematic overview of the research agenda across different infrastructures and hazards, and it is now imperative to evaluate current progress and gaps. This paper presents a systematic review of the equity literature on infrastructure disrupted during natural hazard events. Following a systematic review protocol, we collected, screened, and evaluated almost 3,000 studies. Our analysis focuses on the intersections within an eight-dimensional assessment framework that distinguishes the focus of each study, its methodological approach, and its equity dimensions (distributional-demographic, distributional-spatial, procedural, and capacity equity). To conceptualize the intersection of the different dimensions of equity, we refer to pathways, which identify how equity is constructed, analyzed, and used. Significant findings show that (1) interest in equity in infrastructure resilience has increased exponentially, (2) the majority of studies are in the US and, by extension, the global north, and (3) most data collection relies on descriptive and open data, and none of the international studies use location-intelligence data. The most prominent equity conceptualization is distributional equity, such as disproportionate impacts on vulnerable populations and spaces. The most common pathways to studying equity connect distributional equity to power, water, and transportation infrastructure in response to flooding and hurricanes. Other equity concepts and pathways, such as connections of equity to decision-making and building household capacity, remain understudied. Future research directions include quantifying the social costs of infrastructure disruptions and better integrating equity into resilience decision-making.

    Integration of graphical, physics-based, and machine learning methods for assessment of impact and recovery of the built environment from wind hazards

    The interaction between a natural hazard and a community has the potential to result in a natural disaster with substantial socio-economic losses. In order to minimize disaster impacts, researchers have been improving building codes and exploring further concepts of community resilience. Community resilience refers to a community's ability to absorb a hazard (minimize impacts) and "bounce back" afterwards (quick recovery time). Therefore, the two main components in modeling resilience are the initial impact and the subsequent recovery time. With respect to a community's building stock, this entails the damage state a building sustains and how long it takes to repair and reoccupy that building. Probabilistic and physics-based methods have been the traditional approach to modeling these concepts. With advancements in artificial intelligence and machine learning, as well as in data availability, it may be possible to model impact and recovery differently. Most current methods are highly constrained by their topic area: for example, damage-state models focus on structural loading and resistance, while social vulnerability models independently focus on certain social demographics. These models currently operate independently and are then aggregated, but with the complex connectivity available through machine learning, structural and social characteristics may be combined simultaneously in one network model. The popularity of machine learning predictive modeling across many applications has risen because it can represent complex networks and perhaps identify critical variables that were previously unknown, or the mechanisms by which these variables interact within the predictive problem being modeled. The research presented herein outlines a method of using artificial neural networks to model building damage and recovery times.
    The incorporation of graph theory to analyze the resulting models also provides insight into the "black box" of artificial intelligence and the interaction of socio-technical parameters within the concept of community resilience. The neural network models are then verified by hindcasting the 2011 Joplin tornado, predicting individual building damage and the time it took to repair and reoccupy each building. The results show that these methods are viable for modeling damage, but more research may be needed to model recovery at the same level of accuracy. It is therefore recommended that artificial neural networks be used primarily for problems where the variables are well known but their interactions are not easily understood or modeled. The graphical analysis also reveals the importance of social parameters at all points in the resilience process, while the structural components remain most important in determining the initial impact. Final importance factors are determined for each of the variables evaluated herein. It is suggested, moving forward, that modeling approaches consider integrating how a community interacts with its infrastructure, since the human components are what make a natural hazard a disaster, and tracing artificial neural network connections may provide a starting point for such integration into current traditional modeling approaches.
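    The supervised core of such a damage model — socio-technical features in, damage probability out, weights adjusted by propagating the prediction error — can be sketched with a single logistic neuron, the smallest building block of the networks the abstract describes. The feature names and the tiny hand-made training set below are purely hypothetical, nothing like the Joplin hindcast data.

    ```python
    import math

    # Toy training set: each row is [peak_gust, roof_age, dist_to_track]
    # (normalized, hypothetical features); label 1 = building damaged.
    data = [
        ([0.9, 0.8, 0.2], 1), ([0.8, 0.9, 0.3], 1), ([0.9, 0.7, 0.1], 1),
        ([0.2, 0.1, 0.9], 0), ([0.1, 0.2, 0.8], 0), ([0.3, 0.1, 0.7], 0),
    ]

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    w, b = [0.0, 0.0, 0.0], 0.0  # weights and bias of one logistic neuron

    # Plain gradient descent on the cross-entropy loss; the error (p - y)
    # is the gradient signal that a deeper network would backpropagate.
    for _ in range(1000):
        for x, y in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - 0.5 * err * xi for wi, xi in zip(w, x)]
            b -= 0.5 * err

    def predict_damage(x):
        """Return the learned probability that a building with features x is damaged."""
        return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    ```

    After training, a high-exposure feature vector yields a damage probability above 0.5 and a sheltered one below it; the learned weights play the role of the importance factors the research extracts via graph analysis, though a real multi-layer network entangles them far more.
    
    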