16 research outputs found

    Knowledge as a Service Framework for Disaster Data Management

    Each year, natural disasters strike across the globe, killing hundreds of people and causing billions of dollars in property and infrastructure damage. Minimizing the impact of disasters is imperative in today's society. As the capabilities of software and hardware evolve, so does the role of information and communication technology in disaster mitigation, preparation, response, and recovery. A large quantity of disaster-related data is available, including response plans, records of previous incidents, simulation data, social media data, and Web sites. However, current data management solutions offer few or no integration capabilities. Moreover, recent advances in cloud computing, big data, and NoSQL open the door for new solutions in disaster data management. In this paper, a Knowledge as a Service (KaaS) framework is proposed for disaster cloud data management (Disaster-CDM), with the objectives of 1) storing large amounts of disaster-related data from diverse sources, 2) facilitating search, and 3) supporting data interoperability and integration. Data are stored in a cloud environment using a combination of relational and NoSQL databases. The case study presented in this paper illustrates the use of Disaster-CDM on an example involving simulation models.
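The abstract describes Disaster-CDM only at the architectural level, so the snippet below is a minimal sketch of the storage-and-search pattern it outlines, assuming a MongoDB-style document store; the collection and field names are hypothetical and not the framework's actual implementation.

```python
from pymongo import MongoClient, TEXT

client = MongoClient("mongodb://localhost:27017")
db = client["disaster_cdm"]
docs = db["disaster_docs"]  # hypothetical collection name

# Store heterogeneous disaster-related records as schema-free documents.
docs.insert_one({
    "source": "response_plan",
    "title": "Regional flood response plan",
    "body": "Evacuation routes, shelter locations, and contact points ...",
    "tags": ["flood", "evacuation"],
})

# A text index enables simple full-text search across the stored documents.
docs.create_index([("title", TEXT), ("body", TEXT)])
for hit in docs.find({"$text": {"$search": "evacuation shelter"}}):
    print(hit["title"])
```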

    empathi: An ontology for Emergency Managing and Planning about Hazard Crisis

    In the domain of emergency management during hazard crises, having sufficient situational awareness information is critical. It requires capturing and integrating information from sources such as satellite images, local sensors, and social media content generated by local people. A major obstacle to capturing, representing, and integrating such heterogeneous and diverse information is the lack of a proper ontology that conceptualizes this domain and aggregates and unifies datasets. Thus, in this paper, we introduce the empathi ontology, which conceptualizes the core concepts of the domain of emergency management and planning for hazard crises. Although empathi takes a coarse-grained view, it covers the concepts and relations essential to this domain. The ontology is available at https://w3id.org/empathi/
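Since the ontology is published at the IRI above, one quick way to inspect its vocabulary is to load it with rdflib. The sketch below assumes the IRI dereferences to an RDF/XML serialization, which may need adjusting.

```python
from rdflib import Graph
from rdflib.namespace import RDF, RDFS, OWL

g = Graph()
# The empathi IRI comes from the abstract; the serialization returned when
# dereferencing it is an assumption here.
g.parse("https://w3id.org/empathi/", format="xml")

# List the classes the ontology defines, with labels where available.
for cls in g.subjects(RDF.type, OWL.Class):
    label = g.value(cls, RDFS.label)
    print(cls, "-", label)
```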

    A TASK-ORIENTED DISASTER INFORMATION CORRELATION METHOD

    Collaborative knowledge as a service applied to the disaster management domain

    Cloud computing offers services which promise to meet continuously increasing computing demands by using a large number of networked resources. However, data heterogeneity remains a major hurdle for data interoperability and data integration. In this context, a Knowledge as a Service (KaaS) approach has been proposed with the aim of generating knowledge from heterogeneous data and making it available as a service. In this paper, a Collaborative Knowledge as a Service (CKaaS) architecture is proposed, with the objective of satisfying consumer knowledge needs by integrating disparate cloud knowledge through collaboration among distributed KaaS entities. The NIST cloud computing reference architecture is extended by adding a KaaS layer that integrates diverse sources of data stored in a cloud environment. CKaaS implementation is domain-specific; therefore, this paper presents its application to the disaster management domain. A use case demonstrates the collaboration of knowledge providers and shows how CKaaS operates with simulation models.
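The CKaaS architecture is described only abstractly here, so the following is a loose illustration of the collaboration pattern it names: a coordinator fanning a consumer request out to several distributed knowledge providers and merging their answers. The class and method names are assumptions, not part of CKaaS.

```python
from dataclasses import dataclass
from typing import Protocol

class KaaSProvider(Protocol):
    """Minimal interface a distributed KaaS entity is assumed to expose."""
    def query(self, request: str) -> list[dict]: ...

@dataclass
class CKaaSCoordinator:
    """Illustrative coordinator that answers a consumer request by combining
    results from several knowledge providers."""
    providers: list[KaaSProvider]

    def answer(self, request: str) -> list[dict]:
        combined: list[dict] = []
        for provider in self.providers:
            # Each provider answers from its own cloud-hosted knowledge base;
            # results are merged into a single response for the consumer.
            combined.extend(provider.query(request))
        return combined

class StaticKaaS:
    """Toy provider returning canned knowledge records (illustration only)."""
    def __init__(self, records: list[dict]):
        self._records = records

    def query(self, request: str) -> list[dict]:
        return [r for r in self._records if request.lower() in r.get("topic", "").lower()]

coordinator = CKaaSCoordinator(providers=[
    StaticKaaS([{"topic": "flood response", "source": "provider-A"}]),
    StaticKaaS([{"topic": "flood simulation", "source": "provider-B"}]),
])
print(coordinator.answer("flood"))
```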

    MiR-EO: Middleware Reflexivo para la Emergencia Ontológica en Ambientes Inteligentes

    In a Smart Environment (AmI), the participating devices must continuously exchange knowledge, which requires that they understand and manage a common language in order to achieve semantic interoperability. Ontologies are an ideal tool for this in an AmI, making communication possible between the intelligent objects that are part of the environment. These ontologies must be distributed, heterogeneous, and dynamic, since they must adapt to the changes, needs, and services of the AmI. This article proposes the implementation of a middleware that enables ontological emergence in order to manage all the knowledge that can be generated in an AmI. This middleware, called MiR-EO, is implemented as a reflective middleware that manages its own ontological framework, made up of meta-ontologies modeling the elements that the ontologies of an AmI must contain, and it enables the ontological emergence process.
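The abstract stays at the conceptual level, so the snippet below is only a rough illustration of the general idea of merging ontology fragments contributed by smart objects into a shared graph; the file names and the merge step are assumptions and not MiR-EO's actual design.

```python
from rdflib import Graph

# Hypothetical ontology fragments contributed by devices in the environment.
device_fragments = ["thermostat.ttl", "lighting.ttl", "presence_sensor.ttl"]

emergent = Graph()
for path in device_fragments:
    # Each smart object contributes its own ontology fragment; merging them
    # yields a shared, evolving model of the environment's knowledge.
    emergent += Graph().parse(path, format="turtle")

print(f"Emergent graph holds {len(emergent)} triples")
```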

    Disaster Data Management in Cloud Environments

    Facilitating decision-making in a vital discipline such as disaster management requires information gathering, sharing, and integration on a global scale and across governments, industries, communities, and academia. A large quantity of immensely heterogeneous disaster-related data is available; however, current data management solutions offer few or no integration capabilities and limited potential for collaboration. Moreover, recent advances in cloud computing, Big Data, and NoSQL have opened the door for new solutions in disaster data management. In this thesis, a Knowledge as a Service (KaaS) framework is proposed for disaster cloud data management (Disaster-CDM) with the objectives of 1) facilitating information gathering and sharing, 2) storing large amounts of disaster-related data from diverse sources, and 3) facilitating search and supporting interoperability and integration. Data are stored in a cloud environment, taking advantage of NoSQL data stores. The proposed framework is generic, but this thesis focuses on the disaster management domain and data formats commonly present in that domain, i.e., file-style formats such as PDF, text, MS Office files, and images. The framework component responsible for addressing simulation models is SimOnto. SimOnto, as proposed in this work, transforms domain simulation models into an ontology-based representation with the goal of facilitating integration with other data sources, supporting simulation model querying, and enabling rule and constraint validation. Two case studies presented in this thesis illustrate the use of Disaster-CDM on the data collected during the Disaster Response Network Enabled Platform (DR-NEP) project. The first case study demonstrates Disaster-CDM integration capabilities through full-text search and querying services. In contrast to direct full-text search, Disaster-CDM full-text search also includes simulation model files as well as text contained in image files. Moreover, Disaster-CDM provides querying capabilities, and this case study demonstrates how file-style data can be queried by taking advantage of a NoSQL document data store. The second case study focuses on simulation models and uses SimOnto to transform proprietary simulation models into ontology-based models, which are then stored in a graph database. This case study demonstrates Disaster-CDM benefits by showing how simulation models can be queried and how model compliance with rules and constraints can be validated.
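As one concrete illustration of the kind of querying the second case study describes, ontology-based simulation models stored in a graph database can be interrogated with a graph query. The sketch below uses Neo4j and Cypher; the node labels, relationship type, and connection details are assumptions rather than the schema actually used in the thesis.

```python
from neo4j import GraphDatabase

# Connection details and the node/relationship names below are assumptions;
# the thesis does not publish the graph schema of the SimOnto output.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

query = """
MATCH (m:SimulationModel)-[:HAS_COMPONENT]->(c:Component)
WHERE c.type = $component_type
RETURN m.name AS model, c.name AS component
"""

with driver.session() as session:
    # Find every simulation model containing a component of the given type.
    for record in session.run(query, component_type="pump"):
        print(record["model"], "->", record["component"])

driver.close()
```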

    A TASK-DRIVEN DISASTER DATA LINK APPROACH

    Development of a National-Scale Big Data Analytics Pipeline to Study the Potential Impacts of Flooding on Critical Infrastructures and Communities

    With the rapid development of the Internet of Things (IoT) and Big Data infrastructure, crowdsourcing techniques have emerged to facilitate data processing and problem solving, particularly for flood emergency purposes. A Flood Analytics Information System (FAIS) has been developed as a Python Web application to gather Big Data from multiple servers and analyze flooding impacts during historical and real-time events. The application is designed to integrate crowd intelligence, machine learning (ML), and natural language processing of tweets to provide flood warnings, with the aim of improving situational awareness for flood risk management and decision making. FAIS allows the user to submit search requests to the United States Geological Survey (USGS) as well as Twitter through a series of queries, which are used to modify the request URLs sent to the data sources. This national-scale prototype combines flood peak rates and river level information with geotagged tweets to identify a dynamic set of locations at risk of flooding. The list of prioritized areas can be updated every 15 minutes as the crowdsourced data and environmental information and conditions change. In addition, FAIS uses the Google Vision API (application programming interface) and image processing algorithms to detect objects (flood, road, vehicle, river, etc.) in time-lapse digital images and build valuable metadata into an image catalog. The application performs Flood Frequency Analysis (FFA) and computes design flow values corresponding to specific return periods, which can help engineers design safe structures and protect against economic losses related to the maintenance of civil infrastructure. FAIS was successfully tested in real time during the Hurricane Dorian flooding event across the Carolinas, where the storm caused extensive damage and disruption to critical infrastructure and the environment. The prototype was also verified on historical events such as the Hurricane Matthew and Florence floods in the Lower PeeDee Basin in the Carolinas.
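The abstract does not say which distribution FAIS fits for its Flood Frequency Analysis, so the sketch below illustrates the general FFA step with a Gumbel (EV1) fit to hypothetical annual peak flows; the gauge data, distribution choice, and return periods are all assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak flows (m^3/s); FAIS would obtain these from USGS gauges.
annual_peaks = np.array([410, 525, 380, 610, 455, 700, 390, 580, 495, 640])

# A Gumbel (EV1) fit is used here purely to illustrate flood frequency analysis;
# it is not necessarily the distribution FAIS employs.
loc, scale = stats.gumbel_r.fit(annual_peaks)

for T in (10, 50, 100):
    # The design flow for return period T corresponds to the (1 - 1/T) quantile.
    q_design = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"{T}-year design flow: {q_design:.0f} m^3/s")
```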

    Disaster recovery in cloud computing systems: an overview

    With the rapid growth of internet technologies, large-scale online services, such as data backup and data recovery, are increasingly available. Since these large-scale online services require substantial networking, processing, and storage capacities, it has become a considerable challenge to design equally large-scale computing infrastructures that support these services cost-effectively. In response to this rising demand, cloud computing has been refined during the past decade and turned into a lucrative business for organizations that own large datacenters and offer their computing resources. Undoubtedly, cloud computing provides tremendous benefits for data storage, backup, and data accessibility at a reasonable cost. This paper aims at surveying and analyzing the previous works proposed for disaster recovery in cloud computing. The discussion concentrates on investigating the positive aspects and the limitations of each proposal. Also discussed are the current challenges in handling data recovery in the cloud context and the impact of data backup plans on maintaining data in the event of natural disasters. A summary of the leading research work is provided, outlining weaknesses and limitations in the area of disaster recovery in the cloud computing environment. An in-depth discussion of current and future research trends in the area of disaster recovery in cloud computing is also offered. Several research directions that ought to be explored are pointed out as well, which may help researchers discover and further investigate unresolved problems related to disaster recovery in the cloud environment.