
    The credibility challenge for global fluvial flood risk analysis

    Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Such analysis has traditionally been undertaken at catchment and national scales, but efforts have recently intensified to estimate flood risk globally, to better allow consistent and equitable decision making. Global flood hazard models are now a practical reality, thanks to improvements in numerical algorithms, global datasets, computing power, and coupled modelling frameworks. Outputs of these models are vital for consistent quantification of global flood risk and for projecting the impacts of climate change. However, the urgency of these tasks means that outputs are being used as soon as they are made available, before such methods have been adequately tested. To address this, we compare multi-probability flood hazard maps for Africa from six global models and show wide variation in their flood hazard, economic loss, and exposed population estimates, which has serious implications for model credibility. While there is around 30-40% agreement in flood extent, our results show that even at continental scales there are significant differences in hazard magnitude and spatial pattern between models, notably in deltas, arid/semi-arid zones, and wetlands. This study is an important step towards a better understanding of global flood hazard modelling, which is urgently required for both current risk assessment and climate change projections.
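    A flood-extent agreement figure like the 30-40% quoted above can be made concrete with a short sketch. This is not the paper's actual method; it is a minimal example assuming each model's output has already been reduced to a boolean flooded/not-flooded raster on a common grid, and the Jaccard-style pairwise score and function name are illustrative choices.

        # Minimal sketch: mean pairwise agreement of binary flood extents.
        # Assumes each model output is a boolean NumPy array on a shared grid.
        import numpy as np
        from itertools import combinations

        def pairwise_extent_agreement(extents):
            """Mean Jaccard-style agreement over all model pairs."""
            scores = []
            for a, b in combinations(extents, 2):
                union = np.logical_or(a, b).sum()
                if union == 0:
                    continue  # neither model floods any cell; skip the pair
                scores.append(np.logical_and(a, b).sum() / union)
            return float(np.mean(scores))

        # Toy example: three random 'models' on a small grid.
        rng = np.random.default_rng(0)
        models = [rng.random((100, 100)) > 0.7 for _ in range(3)]
        print(pairwise_extent_agreement(models))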

    How much do we really know about river flooding?

    Have you ever experienced rain so hard, or so prolonged, that you feared you might soon be up to your eyeballs in water? Sadly, many people in the world have witnessed this firsthand, and it is likely to become more common due to climate change unless we do something to prepare for flooding disasters. Some areas are more prone to floods than others, and the people living there are more at risk. Scientists have developed computer models to map flood-prone areas, and decision makers use the results from those models to plan for future flooding events, limit destruction, and save lives. But are the models accurate enough, considering that human lives may depend on them? To answer this question, we compared the results of six computer models that simulate flood risk in Africa. The models agreed in less than 40% of cases about where exactly it would flood and how much damage there might be.

    Corrigendum: The credibility challenge for global fluvial flood risk analysis (vol 11, 094014, 2016)

    The formula for the Model Agreement Index in the original article is incorrect: it has an extra N in the denominator that should not be there. The correct formula for the Model Agreement Index is shown below. The analyses presented in the paper were conducted with the correct formula.
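    The corrected formula itself is not reproduced in this abstract. Purely as a hypothetical sketch of the kind of index involved, and not the corrigendum's actual formula, a pairwise model agreement index over M models could take the form:

        \[
          \mathrm{MAI} \;=\; \frac{2}{M(M-1)} \sum_{j=1}^{M-1}\sum_{k=j+1}^{M}
            \frac{\lvert F_j \cap F_k \rvert}{\lvert F_j \cup F_k \rvert}
        \]
        % Hypothetical illustration only: F_j is the set of cells flooded by
        % model j. An error of the kind described would correspond to an extra
        % factor N (e.g. the number of grid cells) wrongly in the denominator.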

    Standardization: a necessity for the documentation and archiving in cultural heritage

    The authors give an overview of the state of the art of standardization in the area of Cultural Heritage worldwide. This is especially important because Cultural Heritage is currently being shaped by computer technology, with digital documentation and 3D reconstruction of the past becoming routine. Focusing on the advantages and disadvantages of modern Information Technology (IT) tools, it is demonstrated how user-dependent data can cause many critical situations. The IT revolution and the continuous expansion of this technology have put Cultural Heritage experts under massive pressure to become familiar with, and use, the available computer technology. Cultural Heritage data and information have to be reliably read, sorted, indexed, manipulated, retrieved, and communicated between systems nationally and internationally. The use of IT is highly encouraged and has proven itself a vital tool. At present, however, “island solutions” have emerged that limit the researcher's field of study and lead to incompatible approaches to cataloguing, archiving, presenting, and conserving archaeological artefacts, monuments, and sites, rather than a unified worldwide format. A “standard” in Information Technology can be defined as a set of regulations that guarantees the long-term value of digital data for the storage, exchange, sharing, searching, and retrieval of information between different users and professionals around the world, across the global computer network (Internet) and different hardware and software structures. Based on specific examples, the advantages of standardization, and the dangers of non-standardization, for the globalization of e-documentation and e-archiving in Cultural Heritage in the areas of e-libraries and e-museums are demonstrated and discussed.
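    As one concrete illustration of the kind of standardization discussed, the sketch below serializes a catalogue record using Dublin Core element names, a widely used metadata standard in e-libraries and e-museums. The standard and its namespace are real, but the abstract does not name any specific standard; the record contents and the helper function here are invented for illustration.

        # Minimal sketch: a catalogue record serialized with Dublin Core
        # element names. Record contents are hypothetical.
        import xml.etree.ElementTree as ET

        DC_NS = "http://purl.org/dc/elements/1.1/"

        def to_dublin_core_xml(record):
            """Build a simple Dublin Core XML fragment from a dict."""
            ET.register_namespace("dc", DC_NS)
            root = ET.Element("metadata")
            for element, value in record.items():
                child = ET.SubElement(root, f"{{{DC_NS}}}{element}")
                child.text = value
            return ET.tostring(root, encoding="unicode")

        artefact = {
            "title": "Amphora, red-figure",
            "creator": "Unknown Attic workshop",
            "date": "ca. 450 BC",
            "identifier": "MUS-2024-0042",  # hypothetical inventory number
        }
        print(to_dublin_core_xml(artefact))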

    Total Order Communications: A Practical Analysis

    Total Order (TO) broadcast is a widely used communication abstraction that has been deeply investigated over the last decade. As a result, the sheer number of relevant works may leave practitioners wondering how to select the TO implementation that best fits the requirements of their applications. Different implementations are indeed available, each providing distinct safety guarantees and performance. These aspects must be considered together in order to build a correct and sufficiently performant application. To this end, this paper analyzes six TO implementations embedded in three freely distributed group communication systems, namely Ensemble, Spread, and JavaGroups. The implementations are first classified according to the specifications they enforce, given using a specification framework tailored to total order communications. The implementations are then compared from a performance viewpoint in a simple yet meaningful deployment scenario. In our opinion, this structured information should assist practitioners (i) in understanding in depth the ways in which implementations may differ (specifications, performance) and (ii) in quickly relating a set of total order algorithms to their specifications, implementations, and performance.
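    To make the abstraction concrete, here is a minimal, single-process sketch of a fixed-sequencer total order broadcast. It is not taken from Ensemble, Spread, or JavaGroups; the class and method names are invented, and real implementations must additionally handle network delivery and fault tolerance.

        import queue
        import threading

        class SequencerTOBroadcast:
            """Toy fixed-sequencer total order broadcast.

            One node (the sequencer) stamps every message with a global
            sequence number; members deliver strictly in stamp order, so
            all members observe the same total order of messages.
            """

            def __init__(self, member_ids):
                self._seq = 0
                self._lock = threading.Lock()
                self.inboxes = {m: queue.PriorityQueue() for m in member_ids}

            def broadcast(self, sender, payload):
                with self._lock:        # sequencer assigns the next number
                    self._seq += 1
                    stamped = (self._seq, sender, payload)
                for inbox in self.inboxes.values():
                    inbox.put(stamped)  # stands in for a network send

            def deliver(self, member):
                return self.inboxes[member].get()  # lowest stamp first

        to = SequencerTOBroadcast(["a", "b"])
        to.broadcast("a", "m1")
        to.broadcast("b", "m2")
        assert to.deliver("a") == to.deliver("b")  # same first message everywhere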