
    An object-relational prototype of a GIS-based disaster database

    Natural disasters cause billions of dollars of property and infrastructure damage, unexpected disruption to socio-economic activities and tragic loss of human lives each year. The importance of collecting and maintaining detailed and accurate records of disastrous events for effective risk assessment and disaster mitigation has been widely recognised. Considerable effort has been directed towards the establishment of databases on historic disasters, but many of the resulting databases are little more than lists of historical disaster events. Disaster phenomena vary dramatically in both space and time. It is therefore important to integrate the spatial-temporal dimensions of disaster events into a disaster database to support efficient and interactive querying and reporting operations. It is also important to make such a database readily accessible to a variety of users from government agencies, non-government organisations, research institutes and local communities, to enable effective and efficient emergency response, impact and risk assessment, and mitigation planning. This thesis presents a study that investigates effective and efficient geographical information system (GIS) based approaches to the representation, organisation and access of disaster information, including logical data models for representing disastrous events, the object-relational approach to database implementation, and Internet-based user interfaces for database queries and report generation. Key aspects of a disaster event, including the spatial-temporal dimensions of the hazard and its impacts, are considered in the development of the data models and database implementation in order to support user-friendly querying and reporting operations. The technological strengths of GIS, database management systems, and Internet-related toolboxes are leveraged to develop a prototype of a GIS-based, object-relational disaster database with an Internet-based user interface that supports multi-mode (including map-based) database queries and flexible facilities for report generation.
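The core idea of integrating spatial-temporal dimensions into a disaster event record can be sketched as follows. This is a minimal, hypothetical schema for illustration only; the table and column names are assumptions, not those of the thesis prototype (which uses an object-relational DBMS rather than SQLite).

```python
import sqlite3

# Hypothetical minimal schema: each disaster event carries both temporal
# (start/end) and spatial (lat/lon) dimensions alongside impact attributes.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE disaster_event (
        id          INTEGER PRIMARY KEY,
        hazard_type TEXT NOT NULL,   -- e.g. 'flood', 'earthquake'
        start_time  TEXT NOT NULL,   -- ISO-8601 timestamp
        end_time    TEXT,
        lat         REAL NOT NULL,   -- point location of the hazard
        lon         REAL NOT NULL,
        impact_usd  REAL             -- estimated damage
    )""")
conn.executemany(
    "INSERT INTO disaster_event VALUES (?, ?, ?, ?, ?, ?, ?)",
    [(1, "flood", "1998-07-01", "1998-07-20", 30.5, 114.3, 2.0e9),
     (2, "earthquake", "1999-09-21", "1999-09-21", 23.8, 120.8, 9.2e9)])

# A combined spatial-temporal query: events inside a bounding box
# that started after a given date.
rows = conn.execute(
    """SELECT hazard_type FROM disaster_event
       WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?
         AND start_time >= ?""",
    (20.0, 25.0, 118.0, 122.0, "1999-01-01")).fetchall()
print(rows)  # [('earthquake',)]
```

Because both dimensions live in the same record, a single query can answer "what happened where and when", which a flat list of events cannot support efficiently.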

    The MESSAGEix Integrated Assessment Model and the ix modeling platform (ixmp)

    The MESSAGE Integrated Assessment Model (IAM) developed by IIASA has been a central tool of energy-environment-economy systems analysis in the global scientific and policy arena. It played a major role in the Assessment Reports of the Intergovernmental Panel on Climate Change (IPCC); it provided marker scenarios of the Representative Concentration Pathways (RCPs) and the Shared Socio-Economic Pathways (SSPs); and it underpinned the analysis of the Global Energy Assessment (GEA). Alas, to provide relevant analysis for current and future challenges, numerical models of human and earth systems need to support higher spatial and temporal resolution, facilitate integration of data sources and methodologies across disciplines, and become open and transparent regarding the underlying data, methods, and the scientific workflow. In this manuscript, we present the building blocks of a new framework for an integrated assessment modeling platform; the "ecosystem" comprises: i) an open-source GAMS implementation of the MESSAGEix energy system model integrated with the MACRO economic model; ii) a Java/database backend for version-controlled data management; iii) interfaces for the scientific programming languages Python & R for efficient input data and results processing workflows; and iv) a web-browser-based user interface for model/scenario management and intuitive "drag-and-drop" visualization of results. The framework aims to facilitate the highest level of openness for scientific analysis, bridging the need for transparency with efficient data processing and powerful numerical solvers. The platform is geared towards easy integration of data sources and models across disciplines, spatial scales and temporal disaggregation levels. All tools apply best practice in collaborative software development, and comprehensive documentation of all building blocks and scripts is generated directly from the GAMS equations and the Java/Python/R source code.
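The version-controlled data management described above can be illustrated with a small sketch. This is not the actual ixmp API; the class and method names below are assumptions chosen only to show the idea that every commit stores an immutable snapshot, so earlier model inputs remain reproducible.

```python
import copy

# Illustrative sketch (not the real ixmp interface) of version-controlled
# scenario data management: commits are immutable snapshots of parameters.
class ScenarioStore:
    def __init__(self):
        self._versions = []   # list of (message, snapshot) tuples
        self._working = {}    # mutable working copy of parameter data

    def add_par(self, name, value):
        self._working[name] = value

    def commit(self, message):
        # Deep-copy so later edits cannot mutate the committed version.
        self._versions.append((message, copy.deepcopy(self._working)))
        return len(self._versions) - 1   # version number

    def checkout(self, version):
        return copy.deepcopy(self._versions[version][1])

store = ScenarioStore()
store.add_par("demand", 100)
v0 = store.commit("baseline demand")
store.add_par("demand", 120)
v1 = store.commit("high-demand variant")
print(store.checkout(v0)["demand"], store.checkout(v1)["demand"])  # 100 120
```

The design choice this illustrates is transparency: any published result can name the exact data version it was produced from.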

    Generic unified modelling process for developing semantically rich, dynamic and temporal models

    Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

    Complex Actions for Event Processing

    Automatic reactions triggered by complex events have been deployed with great success in particular domains, among others in algorithmic trading, the automatic reaction to real-time analysis of market data. However, to date, reactions in complex event processing systems are often still limited to mere modifications of internal databases or are realized by means similar to remote procedure calls. In this paper, we argue that expressive complex actions with support for composite workflows and integration of so-called external actions are desirable for a wide range of real-world applications, among them emergency management. This article investigates the particularities of the external actions needed in emergency management, which are initiated inside the event processing system but actually executed by external actuators, and discusses the implications of these particularities for composite actions. Based on these observations, we propose versatile complex actions with temporal dependencies and a seamless integration of complex events and external actions. This article also investigates how the proposed integrated approach towards complex events and complex actions can be evaluated based on simple reactive rules. Finally, it is shown how complex actions can be deployed for a complex event processing system devoted to emergency management.
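The distinction the abstract draws can be sketched in a few lines: a complex event triggers a composite action whose steps have a temporal dependency, and the steps stand in for external actions executed by actuators outside the engine. All names below are illustrative assumptions, not the paper's actual rule language.

```python
# Minimal sketch: a composite action whose ordered steps encode a
# temporal dependency (each step runs only after the previous completes).
class CompositeAction:
    def __init__(self, steps):
        self.steps = steps

    def execute(self, log):
        for step in self.steps:
            step(log)   # sequential execution = temporal dependency

def open_valve(log):     # stand-in for an external actuator call
    log.append("valve opened")

def notify_team(log):    # second external action, must follow the first
    log.append("team notified")

def on_event(event, log):
    # Complex event detection reduced to a simple condition for the sketch.
    if event["type"] == "pressure_spike" and event["value"] > 8.0:
        CompositeAction([open_valve, notify_team]).execute(log)

log = []
on_event({"type": "pressure_spike", "value": 9.5}, log)
print(log)  # ['valve opened', 'team notified']
```

In a real deployment the external steps would be asynchronous and could fail, which is exactly why the paper argues that temporal dependencies between actions need first-class support.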

    A data model for operational and situational information in emergency response: the Dutch case

    During emergency response a great deal of dynamic information is created that needs to be studied and analysed in the decision-making process. However, this analysis is often difficult or not possible at all. A major reason is that much of the information coming from field operations is not archived in a structured way. This paper presents a data model for the management of dynamic data, which captures the situational information (the incident and its effects) and the operational information (the processes activated and the people/departments involved). The model is derived from the emergency response procedures and structural organisation in the Netherlands.
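The split between situational and operational information can be sketched with two record types. The field names below are assumptions for illustration, not the attributes of the Dutch data model.

```python
from dataclasses import dataclass, field

# Situational information: the incident and its effects.
@dataclass
class SituationalInfo:
    incident_type: str
    location: str
    effects: list = field(default_factory=list)

# Operational information: activated processes and involved units.
@dataclass
class OperationalInfo:
    processes: list = field(default_factory=list)      # e.g. 'evacuation'
    units_involved: list = field(default_factory=list)

# One emergency record ties the two views of the same incident together,
# so field reports are archived in a structured, analysable form.
@dataclass
class EmergencyRecord:
    situation: SituationalInfo
    operation: OperationalInfo

rec = EmergencyRecord(
    SituationalInfo("chemical spill", "Rotterdam harbour", ["toxic cloud"]),
    OperationalInfo(["evacuation"], ["fire brigade", "police"]))
print(rec.operation.processes)  # ['evacuation']
```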

    State-of-the-art on evolution and reactivity

    This report starts, in Chapter 1, by outlining aspects of querying and updating resources on the Web and on the Semantic Web, including the development of query and update languages to be carried out within the Rewerse project. From this outline, it becomes clear that several existing research areas and topics are of interest for this work in Rewerse. In the remainder of this report we present state-of-the-art surveys of a selection of such areas and topics. More precisely: in Chapter 2 we give an overview of logics for reasoning about state change and updates; Chapter 3 is devoted to briefly describing existing update languages for the Web, and also for updating logic programs; in Chapter 4 event-condition-action rules, both in the context of active database systems and in the context of semistructured data, are surveyed; in Chapter 5 we give an overview of some relevant rule-based agent frameworks.
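The event-condition-action (ECA) pattern surveyed in Chapter 4 follows the shape ON event IF condition DO action, which a few lines can illustrate. The rule and event representations below are an assumed encoding for the sketch, not any of the surveyed languages.

```python
# Minimal ECA engine: a rule fires its action only when an event of the
# matching type arrives AND the condition holds against the database.
class Rule:
    def __init__(self, event_type, condition, action):
        self.event_type = event_type
        self.condition = condition   # (event, db) -> bool
        self.action = action         # (event, db) -> None

def dispatch(event, rules, db):
    for r in rules:
        if event["type"] == r.event_type and r.condition(event, db):
            r.action(event, db)

db = {"stock": 5}
rules = [Rule(
    "sale",
    lambda e, db: db["stock"] >= e["qty"],                  # condition
    lambda e, db: db.update(stock=db["stock"] - e["qty"]))] # action

dispatch({"type": "sale", "qty": 2}, rules, db)
print(db["stock"])  # 3
```

Active database systems extend this basic loop with coupling modes and transaction semantics, which is where much of the design space surveyed in the report lies.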

    Industrial implementation of intelligent system techniques for nuclear power plant condition monitoring

    As the nuclear power plants within the UK age, there is an increased requirement for condition monitoring to ensure that the plants are still able to operate safely. This paper describes the novel application of Intelligent Systems (IS) techniques to provide decision support for the condition monitoring of Nuclear Power Plant (NPP) reactor cores within the UK. The resulting system, BETA (British Energy Trace Analysis), is deployed within the UK’s nuclear operator and provides automated decision support for the analysis of refuelling data, a lead indicator of the health of AGR (Advanced Gas-cooled Reactor) nuclear power plant cores. The key contribution of this work is the improvement of existing manual, labour-intensive analysis through the application of IS techniques to provide decision support for NPP reactor core condition monitoring. This enables an existing source of condition monitoring data to be analysed in a rapid and repeatable manner, providing additional information relating to core health on a more regular basis than routine inspection data allows. The application of IS techniques addresses two issues with the existing manual interpretation of the data, namely the limited availability of expertise and the variability of assessment between different experts. Decision support is provided by four applications of intelligent systems techniques. Two instances of a rule-based expert system are deployed, the first to automatically identify key features within the refuelling data and the second to classify specific types of anomaly. Clustering techniques are applied to support the definition of benchmark behaviour, which is used to detect the presence of anomalies within the refuelling data. Finally, data mining techniques are used to track the evolution of the normal benchmark behaviour over time.
This results in a system that not only provides support for analysing new refuelling events but also provides the platform on which future events can be analysed. The BETA system has been deployed within the nuclear operator in the UK and is used both at the engineering offices and on station to support the analysis of refuelling events from two AGR stations, with a view to expanding it to the rest of the fleet in the near future.
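One element of the approach above, detecting anomalies against a benchmark derived from normal behaviour, can be sketched briefly. The single feature (trace peak), the 3-sigma threshold, and the function names are assumptions for illustration; they are not BETA's actual rules or features.

```python
from statistics import mean, stdev

# Build a benchmark from normal refuelling traces using one simple
# feature (the peak value of each trace).
def build_benchmark(normal_traces):
    peaks = [max(t) for t in normal_traces]
    return mean(peaks), stdev(peaks)

# Rule-based test: flag a trace whose peak deviates from the benchmark
# by more than k standard deviations (k = 3 assumed here).
def is_anomalous(trace, benchmark, k=3.0):
    mu, sigma = benchmark
    return abs(max(trace) - mu) > k * sigma

normal = [[1.0, 2.1, 1.2], [1.1, 2.0, 1.0], [0.9, 1.9, 1.1]]
bench = build_benchmark(normal)
print(is_anomalous([1.0, 5.0, 1.2], bench))  # True
```

Because the benchmark is recomputed from data rather than hand-coded, it can be tracked as normal behaviour evolves over time, mirroring the data-mining component described in the abstract.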