41 research outputs found

    Development and assessment of uni- and multivariable flood loss models for Emilia-Romagna (Italy)

    Flood loss models are an important source of uncertainty in flood risk assessments. Many countries experience sparseness or absence of comprehensive high-quality flood loss data, which is often rooted in a lack of protocols and reference procedures for compiling loss datasets after flood events. Such data are an important reference for developing and validating flood loss models. We consider the Secchia River flood event of January 2014, when a sudden levee breach caused the inundation of nearly 52 km2 in northern Italy. After this event, local authorities collected a comprehensive flood loss dataset of affected private households, including building footprints and structures as well as damage to buildings and contents. The dataset was enriched with further information compiled by us, including economic building values and the maximum water depth, velocity and flood duration for each building. By analyzing this dataset, we tackle the problem of flood damage estimation in Emilia-Romagna (Italy) by identifying empirical uni- and multivariable loss models for residential buildings and contents. The accuracy of the proposed models is compared with that of several flood damage models reported in the literature, providing additional insights into the transferability of models among different contexts. Our results show that (1) even simple univariable damage models based on local data are significantly more accurate than literature models derived for different contexts, and (2) multivariable models that consider several explanatory variables outperform univariable models, which use only water depth. However, multivariable models can only be effectively developed and applied if sufficient and detailed information is available.
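    A univariable loss model of the kind described above maps water depth to a relative damage fraction, which is then scaled by the exposed building value. The sketch below is purely illustrative: the square-root functional form and the coefficient are assumptions, not the fitted Emilia-Romagna model.

    ```python
    # Minimal sketch of a univariable depth-damage model: relative loss to a
    # residential building as a function of maximum water depth.
    # The sqrt form and coefficient a=0.30 are illustrative assumptions.
    import numpy as np

    def relative_loss(depth_m, a=0.30):
        """Relative damage in [0, 1] from water depth in metres (sqrt form)."""
        return np.clip(a * np.sqrt(np.maximum(depth_m, 0.0)), 0.0, 1.0)

    def absolute_loss(depth_m, building_value_eur):
        """Absolute loss = relative damage x exposed building value."""
        return relative_loss(depth_m) * building_value_eur

    # e.g. 1.0 m of water in a building worth EUR 100,000
    loss = absolute_loss(1.0, 100_000)
    ```

    Multivariable models extend this by adding velocity, duration and building characteristics as predictors, at the cost of needing the detailed data the abstract emphasises.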

    Testing empirical and synthetic flood damage models: The case of Italy

    Flood risk management generally relies on economic assessments performed by using flood loss models of different complexity, ranging from simple univariable models to more complex multivariable models. The latter account for a large number of hazard, exposure and vulnerability factors and are potentially more robust when extensive input information is available. We collected a comprehensive data set on three recent major flood events in northern Italy (Adda 2002, Bacchiglione 2010 and Secchia 2014), including flood hazard features (depth, velocity and duration), building characteristics (size, type, quality, economic value) and reported losses. The objective of this study is to compare the performance of expert-based and empirical (both uni- and multivariable) damage models for estimating the potential economic costs of flood events to residential buildings. The performance of four literature flood damage models of different natures and complexities is compared with that of univariable, bivariable and multivariable models trained and tested by using empirical records from Italy. The uni- and bivariable models are developed by using linear, logarithmic and square root regression, whereas the multivariable models are based on two machine-learning techniques: random forest and artificial neural networks. The results provide important insights into the choice of damage modelling approach for operational disaster risk management. Our findings suggest that multivariable models have better potential for producing reliable damage estimates when extensive ancillary data for flood event characterisation are available, while univariable models can be adequate if data are scarce. The analysis also highlights that expert-based synthetic models are likely better suited for transfer to other areas than empirically based flood damage models.
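    The uni- and bivariable regression step described above can be sketched with ordinary least squares: loss regressed on depth alone (square-root form) versus depth plus flood duration. The synthetic data and coefficients below are placeholders, not the Italian records.

    ```python
    # Sketch of uni- vs bivariable loss regression (illustrative data only).
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200
    depth = rng.uniform(0.1, 3.0, n)        # flood depth, m
    duration = rng.uniform(1.0, 72.0, n)    # flood duration, h
    # assumed data-generating process for the illustration
    loss = 0.25 * np.sqrt(depth) + 0.002 * duration + rng.normal(0, 0.02, n)

    # Univariable: loss ~ a * sqrt(depth) + b
    X1 = np.column_stack([np.sqrt(depth), np.ones(n)])
    coef1, *_ = np.linalg.lstsq(X1, loss, rcond=None)

    # Bivariable: loss ~ a * depth + b * duration + c
    X2 = np.column_stack([depth, duration, np.ones(n)])
    coef2, *_ = np.linalg.lstsq(X2, loss, rcond=None)

    def rmse(X, coef):
        """In-sample root-mean-square error of a fitted linear model."""
        return float(np.sqrt(np.mean((X @ coef - loss) ** 2)))
    ```

    Because duration genuinely contributes to the synthetic losses here, the bivariable fit attains a lower in-sample error than the depth-only fit, mirroring the paper's finding that extra explanatory variables help when the data support them.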

    Bayesian Data-Driven approach enhances synthetic flood loss models

    Flood loss estimation models are developed using synthetic or empirical approaches. The synthetic approach consists of what-if scenarios developed by experts, whereas empirical models are based on statistical analysis of recorded loss data. In this study, we propose a novel Bayesian data-driven approach to enhance established synthetic models using available empirical data from recorded events. For five case studies in Western Europe, the resulting Bayesian Data-Driven Synthetic (BDDS) model enhances synthetic model predictions by reducing prediction errors and quantifying the uncertainty and reliability of loss predictions for post-event scenarios and future events. The performance of the BDDS model for a potential future event improves as empirical data are integrated once a new flood event affects the region. The BDDS model therefore has high potential for combining established synthetic models with local empirical loss data to provide accurate and reliable flood loss predictions for quantifying future risk.
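    One simple way to convey the idea of Bayesian enhancement of a synthetic model (this is a sketch of the general technique, not the authors' BDDS implementation) is to treat a multiplicative correction factor on the expert curve as uncertain, with a prior centred on 1, and update it from empirical loss/prediction ratios via a conjugate normal model.

    ```python
    # Sketch: Bayesian update of a correction factor on a synthetic loss curve.
    # The synthetic curve, prior, and observations are invented for illustration.
    import numpy as np

    def synthetic_loss(depth_m):
        """Hypothetical expert (synthetic) depth-damage curve."""
        return np.clip(0.35 * depth_m, 0.0, 1.0)

    def update_correction(prior_mean, prior_var, ratios, obs_var):
        """Conjugate normal update of the correction factor from observed
        empirical/synthetic loss ratios (known observation variance)."""
        n = len(ratios)
        post_var = 1.0 / (1.0 / prior_var + n / obs_var)
        post_mean = post_var * (prior_mean / prior_var + np.sum(ratios) / obs_var)
        return post_mean, post_var

    # Empirical losses for buildings at known depths (illustrative numbers)
    depths = np.array([0.5, 1.0, 1.5, 2.0])
    observed = np.array([0.22, 0.40, 0.60, 0.80])
    ratios = observed / synthetic_loss(depths)

    mean, var = update_correction(prior_mean=1.0, prior_var=0.25,
                                  ratios=ratios, obs_var=0.04)
    ```

    The posterior mean exceeds 1 (the empirical losses sit above the synthetic curve) and the posterior variance shrinks below the prior, illustrating how each new event makes the corrected model both better calibrated and more certain.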

    Quantifying the effects of nature-based solutions in reducing risks from hydrometeorological hazards: Examples from Europe

    The combination of climate change and social and ecological factors will increase the risks societies face from hydrometeorological hazards (HMH). Reducing these risks is typically achieved through the deployment of engineered (or grey) infrastructure, but increasingly, nature-based solutions (NBS) are being considered. Most risk assessment frameworks do not adequately capture the role NBS can play in addressing all components of risk, i.e., the hazard characteristics and the exposure and vulnerability of social-ecological systems. Recently, the Vulnerability and Risk assessment framework developed to allow the assessment of risks in the context of NBS implementation (VR-NBS framework) was proposed. Here, we carry out the first implementation of this framework using five case study areas in Europe that are exposed to various HMH. Our results demonstrate the effect NBS have in terms of risk reduction, achieved by using a flexible library of indicators that captures the specificities of each case study's hazard, social and ecological circumstances. The approach appears to be more effective for larger case study areas, but further testing is required in a broader variety of contexts.
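    The general mechanics of indicator-based risk scoring, in the spirit of the framework described above, can be sketched as follows: normalised hazard, exposure and vulnerability indicators are averaged per component and combined into a composite risk score, computed before and after NBS implementation. All indicator names, values and weights below are invented; the VR-NBS framework's actual indicator library and aggregation rules may differ.

    ```python
    # Illustrative sketch of composite risk scoring before/after NBS.
    def component_score(indicators):
        """Average of normalised indicator values (all in [0, 1])."""
        return sum(indicators.values()) / len(indicators)

    def risk_score(hazard, exposure, vulnerability):
        """Composite risk as the product of the three component scores."""
        return (component_score(hazard) * component_score(exposure)
                * component_score(vulnerability))

    before = risk_score(
        hazard={"flood_extent": 0.7, "peak_discharge": 0.6},
        exposure={"people_in_floodplain": 0.5, "assets_in_floodplain": 0.6},
        vulnerability={"building_fragility": 0.5, "social_vulnerability": 0.4},
    )
    # An NBS such as floodplain restoration mainly lowers hazard indicators
    after = risk_score(
        hazard={"flood_extent": 0.4, "peak_discharge": 0.45},
        exposure={"people_in_floodplain": 0.5, "assets_in_floodplain": 0.6},
        vulnerability={"building_fragility": 0.5, "social_vulnerability": 0.4},
    )
    reduction = 1.0 - after / before
    ```

    Keeping the indicator library flexible, as the abstract stresses, amounts to swapping different indicator dictionaries into the same aggregation.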

    The Brazilian policy of withholding treatment for ADHD is probably increasing health and social costs

    Objective: To estimate the economic consequences of the current Brazilian government policy for attention-deficit/hyperactivity disorder (ADHD) treatment and how much the country would save if treatment with immediate-release methylphenidate (MPH-IR), as suggested by the World Health Organization (WHO), were offered to patients with ADHD. Method: Based on conservative previous analyses, we assumed that 257,662 patients aged 5 to 19 years are not receiving ADHD treatment in Brazil. We estimated the direct costs and savings of treating and not treating ADHD on the basis of the following data: a) spending on ADHD patients directly attributable to grade retention and emergency department visits; and b) savings due to the impact of ADHD treatment on these outcomes. Results: Considering outcomes for which data on the impact of MPH-IR treatment are available, Brazil is probably wasting approximately R$ 1.841 billion/year on the direct consequences of not treating ADHD in this age range alone. On the other hand, treating ADHD in accordance with WHO recommendations would save approximately R$ 1.163 billion/year. Conclusions: By increasing investment in MPH-IR treatment for ADHD to around R$ 377 million/year, the country would save approximately 3.1 times more than is currently spent on the consequences of not treating ADHD in patients aged 5 to 19 years.
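    The headline ratio in the conclusions follows directly from the two figures in the abstract; a quick arithmetic check:

    ```python
    # Figures from the abstract, in R$ billion/year
    investment = 0.377   # proposed spending on MPH-IR treatment
    savings = 1.163      # savings from treating ADHD per WHO recommendations

    ratio = savings / investment  # ~3.08, i.e. the "3.1 times" in the abstract
    ```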

    The Changing Landscape for Stroke Prevention in AF: Findings From the GLORIA-AF Registry Phase 2

    Background GLORIA-AF (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation) is a prospective, global registry program describing antithrombotic treatment patterns in patients with newly diagnosed nonvalvular atrial fibrillation at risk of stroke. Phase 2 began when dabigatran, the first non-vitamin K antagonist oral anticoagulant (NOAC), became available. Objectives This study sought to describe phase 2 baseline data and compare these with the pre-NOAC era collected during phase 1. Methods During phase 2, 15,641 consenting patients were enrolled (November 2011 to December 2014); 15,092 were eligible. This pre-specified cross-sectional analysis describes eligible patients' baseline characteristics. Atrial fibrillation disease characteristics, medical outcomes, and concomitant diseases and medications were collected. Data were analyzed using descriptive statistics. Results Of the total patients, 45.5% were female; median age was 71 (interquartile range: 64, 78) years. Patients were from Europe (47.1%), North America (22.5%), Asia (20.3%), Latin America (6.0%), and the Middle East/Africa (4.0%). Most had high stroke risk (CHA2DS2-VASc [Congestive heart failure, Hypertension, Age ≥75 years, Diabetes mellitus, previous Stroke, Vascular disease, Age 65 to 74 years, Sex category] score ≥2; 86.1%); 13.9% had moderate risk (CHA2DS2-VASc = 1). Overall, 79.9% received oral anticoagulants, of whom 47.6% received NOAC and 32.3% vitamin K antagonists (VKA); 12.1% received antiplatelet agents; 7.8% received no antithrombotic treatment. For comparison, the proportion of phase 1 patients (of N = 1,063 all eligible) prescribed VKA was 32.8%, acetylsalicylic acid 41.7%, and no therapy 20.2%. In Europe in phase 2, treatment with NOAC was more common than VKA (52.3% and 37.8%, respectively); 6.0% of patients received antiplatelet treatment; and 3.8% received no antithrombotic treatment. In North America, 52.1%, 26.2%, and 14.0% of patients received NOAC, VKA, and antiplatelet drugs, respectively; 7.5% received no antithrombotic treatment. NOAC use was less common in Asia (27.7%), where 27.5% of patients received VKA, 25.0% antiplatelet drugs, and 19.8% no antithrombotic treatment. Conclusions The baseline data from GLORIA-AF phase 2 demonstrate that in newly diagnosed nonvalvular atrial fibrillation patients, NOAC have been highly adopted into practice, becoming more frequently prescribed than VKA in Europe and North America. Worldwide, however, a large proportion of patients remain undertreated, particularly in Asia and North America. (Global Registry on Long-Term Oral Antithrombotic Treatment in Patients With Atrial Fibrillation [GLORIA-AF]; NCT01468701)
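    The CHA2DS2-VASc score whose components the abstract spells out is a simple additive rule; a sketch of the standard scoring (the component weights are the conventional ones, not stated explicitly in the abstract):

    ```python
    # CHA2DS2-VASc stroke-risk score for atrial fibrillation.
    # Conventional weights: most factors score 1; age >=75 and prior
    # stroke/TIA score 2. Score >=2 = high risk, 1 = moderate risk.
    def cha2ds2_vasc(chf, hypertension, age, diabetes, prior_stroke,
                     vascular_disease, female):
        score = 0
        score += 1 if chf else 0               # Congestive heart failure
        score += 1 if hypertension else 0      # Hypertension
        score += 2 if age >= 75 else (1 if age >= 65 else 0)  # Age bands
        score += 1 if diabetes else 0          # Diabetes mellitus
        score += 2 if prior_stroke else 0      # Previous stroke/TIA
        score += 1 if vascular_disease else 0  # Vascular disease
        score += 1 if female else 0            # Sex category (female)
        return score

    # e.g. a 71-year-old woman with hypertension: 1 (age 65-74) + 1 + 1 = 3
    example = cha2ds2_vasc(False, True, 71, False, False, False, True)
    ```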