
    Workflow-based Collaborative Decision Support for Flood Management Systems

    Simulation-based decision making is one of the prospective applications of computational science and is central to advances in many scientific fields. The complexity and interdisciplinarity of scientific problems lead to new technologies for implementing simulation software based on cloud computing, workflow tools and close interaction between experts and decision-makers. An important challenge in this field is to combine simulation scenarios, expert decisions and a distributed environment to solve complex interdisciplinary problems. In this paper, we describe a way to organize collaborative decision support on the basis of the e-Science platform CLAVIRE, with an emphasis on urgency. As a case study in decision making, we consider gate maneuvering for flood prevention in Saint Petersburg as part of a flood management system.

    Distributed simulation of city inundation by coupled surface and subsurface porous flow for urban flood decision support system

    We present a decision support system for flood early warning and disaster management. It includes models for data-driven meteorological prediction and for the simulation of atmospheric pressure, wind, long sea waves and seiches; a module for optimizing the operation of flood barrier gates; models for the stability assessment of levees and embankments; and models for simulating city inundation dynamics and citizen evacuation scenarios. The novelty of this paper is a coupled distributed simulation of surface and subsurface flows that can predict inundation of low-lying inland zones far from the submerged waterfront areas, as observed in St. Petersburg during floods. All the models are wrapped as software services in the CLAVIRE platform for urgent computing, which provides workflow management and resource orchestration.
    Comment: Pre-print submitted to the 2013 International Conference on Computational Science

    Models of everywhere revisited: a technological perspective

    The concept ‘models of everywhere’ was first introduced in the mid-2000s as a means of reasoning about the environmental science of a place, changing the nature of the underlying modelling process from one in which general model structures are used to one in which modelling becomes a learning process about specific places, in particular capturing the idiosyncrasies of each place. At one level, this is a straightforward concept, but at another it is a rich multi-dimensional conceptual framework involving the following key dimensions: models of everywhere, models of everything and models at all times, constantly re-evaluated against the most current evidence. This is a compelling approach with the potential to deal with epistemic uncertainties and nonlinearities. However, the approach has not yet been fully utilised or explored. This paper examines the concept of models of everywhere in the light of recent advances in technology. The paper argues that, when the concept was first proposed, technology was a limiting factor, but now, with advances in areas such as the Internet of Things, cloud computing and data analytics, many of those barriers have been removed. Consequently, it is timely to look again at the concept of models of everywhere under practical conditions as part of a trans-disciplinary effort to tackle the remaining research questions. The paper concludes by identifying the key elements of a research agenda that should underpin such experimentation and deployment.

    Knowledge-based Expressive Technologies within Cloud Computing Environments

    This paper describes the development of a comprehensive approach to knowledge processing within e-Science tasks. Considering task solving within a simulation-driven approach, a set of knowledge-based procedures for task definition and composite application processing can be identified. These procedures can be supported by domain-specific knowledge that is formalized and used for automation purposes. In this work, we propose a conceptual and technological knowledge-based toolbox to support the solving of complex multidisciplinary tasks. Using the CLAVIRE cloud computing environment as a core platform, a set of interconnected expressive technologies was developed.
    Comment: Proceedings of the 8th International Conference on Intelligent Systems and Knowledge Engineering (ISKE2013). 2013

    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    Remote-sensing technology and data-storage capabilities have advanced over the last decade to the point of commercial multi-sensor data collection. There is a constant need to characterize, quantify and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allow the operator to extend seafloor characterization from multibeam backscatter towards land, creating a seamless ocean-to-land characterization of the littoral zone.

    Collaborative Environment for Grid-based Flood Prediction

    This paper presents the design, architecture and main implementation features of the flood prediction application of Task 1.2 of the EU IST CROSSGRID project. The paper begins with a description of the virtual organization of hydrometeorological experts, users, data providers and customers supported by the application. The architecture of the application is then described, followed by the simulation models used and the modules of the collaborative environment. The paper ends with a vision of the future development of the application.

    Rapid configurational analysis using OSM data: towards the use of Space Syntax to orient post-disaster decision making

    This paper addresses the problem of the growing exposure of contemporary cities to natural hazards by discussing the theoretical, methodological and practical aspects of using the configurational approach as a framework for performing a variety of spatial analyses to better orient disaster management. It claims that enabling a quick assessment of the evolving spatial functioning of the urban grid would effectively support strategic decision-making and make post-disaster planning decisions more explicit among stakeholders, thus boosting wider understanding and participation among the public. The paper starts with a brief review of relevant work done by the research community to date, which highlights emergent opportunities for urban morphology studies and Space Syntax theory to trigger effective innovations in disaster management practice. Next, the paper proposes a fit-for-purpose analysis approach with the aim of achieving higher procedural flexibility in the analysis workflow. This issue is treated with a special focus on the needs of relief organisations, which must integrate and overlap numerous layers of information and consider the feasibility of the analysis by evaluating time and costs. The proposal considers the economy of map construction to be fundamental for ensuring the feasibility of a quantitative spatial assessment in data-scarce contexts such as cities affected by disasters. Moreover, it recognises that the uniqueness of the map is likely to enable better communication among different stakeholders following a BIM-oriented model of cooperation, while allowing a faster response in multi-hazard scenarios.
    Consequently, the proposal challenges the idea that there is a uniquely correct way to translate reality into a model, and instead suggests using a set of simplification techniques, such as filtering, generalisation and re-modelling, on a single crowdsourced map of the urban street network to generate suitably customised graphs for subsequent analysis. This brings together two themes: the first concerns the modelling activity per se and how certain technicalities that seem like minor details can strongly influence the final analysis output; the second concerns the crowdsourcing of spatial data and the challenges that collaborative datasets pose to the modelling tasks. In line with the most recent research trends, this paper suggests exploiting the readiness of the OpenStreetMap (OSM) geo-dataset and the improving computational capacities of open GIS tools such as QGIS, which has recently achieved wider acceptance worldwide. To further speed up the analysis and increase the likelihood that the configurational analysis method will be successfully deployed by a larger pool of professionals, it also proposes to use the state-of-the-art Python library OSMnx. Finally, the consequences of using Volunteered Geographic Information (VGI), open-source GIS platforms and Python scripting to perform the analysis are illustrated in a set of suitable case studies.
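    The configurational analysis this abstract describes rests on a dual-graph idea: street segments become nodes, adjacency means sharing a junction, and Space Syntax "integration" behaves like closeness centrality on that dual graph (OSMnx automates this pipeline against real OSM data). The sketch below illustrates the idea with stdlib Python only; the toy street network and its segment names are invented for illustration, not taken from any dataset in the paper.

```python
from collections import defaultdict, deque

# Hypothetical toy street network: each tuple is a street segment
# joining two junctions. Topology and names are illustrative only.
segments = [("a", "b"), ("b", "c"), ("b", "d"), ("c", "e"), ("d", "e")]

# Build the dual ("line") graph: segments become nodes, connected
# whenever they share a junction -- the representation that
# configurational (Space Syntax-style) analysis operates on.
dual = defaultdict(set)
for i, seg1 in enumerate(segments):
    for j, seg2 in enumerate(segments):
        if i != j and set(seg1) & set(seg2):
            dual[seg1].add(seg2)

def closeness(graph, source):
    """Closeness centrality of one node, via breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

# "Integration" proxy: closeness of every segment in the dual graph.
integration = {seg: closeness(dual, seg) for seg in dual}
best = max(integration, key=integration.get)  # most integrated segment
```

    In a real workflow the `segments` list would come from a crowdsourced OSM extract, and the filtering, generalisation and re-modelling steps the abstract mentions would be applied to the graph before computing centralities.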

    OpenKnowledge at work: exploring centralized and decentralized information gathering in emergency contexts

    Real-world experience teaches us that efficient crisis response coordination is crucial to managing emergencies. ICT infrastructures are effective in supporting the people involved in such contexts by enabling effective ways of interacting; they should also provide innovative means of communication and information management. At present, centralized architectures are mostly used for this purpose; however, alternative infrastructures based on distributed information sources are currently being explored, studied and analyzed. This paper investigates the capability of a novel approach, developed within the European project OpenKnowledge, to support centralized as well as decentralized architectures for information gathering. For this purpose we developed an agent-based e-Response simulation environment, fully integrated with the OpenKnowledge infrastructure, through which existing emergency plans are modelled and simulated. Preliminary results show that OpenKnowledge can support the two aforementioned architectures and, under ideal assumptions, achieves comparable performance in both cases.