734 research outputs found

    Transcription of adenovirus cores in vitro: major RNA products differ from those made from a DNA template.


    Open source procedure for assessment of loss using global earthquake modelling software (OPAL)

    This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an “Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software” (OPAL). The OPAL procedure was created to provide a framework for optimisation of a Global Earthquake Modelling process through: 1. overview of current and new components of earthquake loss assessment (vulnerability, hazard, exposure, specific cost, and technology); 2. preliminary research, acquisition, and familiarisation for available ELE software packages; 3. assessment of these software packages in order to identify the advantages and disadvantages of the ELE methods used; and 4. loss analysis for a deterministic earthquake (Mw = 7.2) for the Zeytinburnu district, Istanbul, Turkey, by applying 3 software packages (2 new and 1 existing): a modified displacement-based method based on DBELA (Displacement Based Earthquake Loss Assessment, Crowley et al., 2006), a capacity-spectrum-based method, HAZUS (HAZards United States, FEMA, USA, 2003), and the Norwegian HAZUS-based SELENA (SEismic Loss EstimatioN using a logic tree Approach, Lindholm et al., 2007) software, which was adapted for use in order to compare the different processes needed for the production of damage, economic, and social loss estimates. The modified DBELA procedure was found to be more computationally expensive, yet had less variability, indicating the need for multi-tier approaches to global earthquake loss estimation. Similar systems planning and ELE software produced through the OPAL procedure can be applied worldwide, given exposure data.
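
    A minimal sketch of the generic ELE calculation chain that the OPAL procedure compares across packages: a vulnerability function maps hazard intensity to a mean damage ratio, which is combined with exposure and specific cost to give a scenario loss. This is an illustrative skeleton, not the DBELA, HAZUS, or SELENA implementations; the function names, curve parameters, and exposure values are hypothetical.

```python
# Illustrative ELE skeleton: hazard intensity -> mean damage ratio -> direct loss.
# Not the DBELA/HAZUS/SELENA implementations; all parameters are hypothetical.
from math import erf, log, sqrt

def mean_damage_ratio(intensity, theta=0.6, beta=0.5):
    """Lognormal-CDF-style vulnerability curve (hypothetical median and dispersion)."""
    return 0.5 * (1.0 + erf(log(intensity / theta) / (beta * sqrt(2.0))))

def scenario_loss(assets):
    """Sum direct economic loss over exposed assets for one deterministic event."""
    total = 0.0
    for a in assets:
        mdr = mean_damage_ratio(a["intensity"])          # damage ratio in [0, 1]
        total += mdr * a["floor_area_m2"] * a["cost_per_m2"]
    return total

# Hypothetical exposure for two building classes in one district
assets = [
    {"intensity": 0.45, "floor_area_m2": 120_000, "cost_per_m2": 400.0},
    {"intensity": 0.60, "floor_area_m2": 80_000,  "cost_per_m2": 550.0},
]
print(f"Scenario direct loss: {scenario_loss(assets):,.0f} USD")
```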

    Blood Lead Levels and Risk Factors for Lead Exposure in a Pediatric Population in Ho Chi Minh City, Vietnam.

    Although lead recycling activities are a known risk factor for elevated blood lead levels in South East Asia, little is known regarding the prevalence of and risk factors for elevated blood lead levels (BLL) among the general pediatric population in Vietnam. This study is a cross-sectional evaluation of 311 children from Children's Hospital #2 in Ho Chi Minh City, Vietnam. Capillary blood lead testing was performed using the LeadCare II. Mean BLLs were 4.97 µg/dL (Standard Deviation (SD) 5.50), with 7% of the participants having levels greater than 10 µg/dL. Living in Binh Duong province (OR 2.7, 95% CI 1.4-5.6) or Dong Nai province (OR 2.3, 95% CI 1.0-5.1) and having an age greater than 12 months (OR 6.0, 95% CI 3.1-11.8) were associated with higher BLLs. The prevalence of elevated BLLs in Vietnam is consistent with other SE Asian countries. Mean BLLs in Ho Chi Minh City are markedly lower than those seen in a separate study of children living near lead recycling activities. Additional evaluation is necessary to better detail potential risk factors if screening is to be implemented within Vietnam.
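
    As a point of reference for the association measures quoted above, the sketch below shows how an odds ratio and its 95% confidence interval are typically computed from a 2x2 table (exposure group versus elevated BLL), using Woolf's log-scale standard error. The counts are hypothetical and are not the study data.

```python
# Odds ratio with a 95% CI from a 2x2 table (hypothetical counts, not study data).
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: exposed with/without outcome; c/d: unexposed with/without outcome."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)          # Woolf's method, log scale
    return or_, (exp(log(or_) - z * se), exp(log(or_) + z * se))

print(odds_ratio_ci(a=20, b=80, c=10, d=100))          # -> OR 2.5 and its 95% CI
```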

    Why We Can No Longer Ignore Consecutive Disasters

    In recent decades, a striking number of countries have suffered from consecutive disasters: events whose impacts overlap both spatially and temporally, while recovery is still under way. The risk of consecutive disasters will increase due to growing exposure, the interconnectedness of human society, and the increased frequency and intensity of non-tectonic hazards. This paper provides an overview of the different types of consecutive disasters, their causes, and impacts. The impacts can be distinctly different from those of disasters occurring in isolation (both spatially and temporally) from other disasters, noting that full isolation never occurs. We use existing empirical disaster databases to show the global probabilistic occurrence for selected hazard types. Current state‐of‐the‐art risk assessment models and their outputs do not allow for a thorough representation and analysis of consecutive disasters. This is mainly due to the many challenges that are introduced by addressing and combining hazards of different natures, and accounting for their interactions and dynamics. Disaster risk management needs to be more holistic and co-designed between researchers, policy makers, first responders, and companies.
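
    One simple way to operationalise "consecutive" events from an empirical disaster database is sketched below: flag pairs of events in the same country whose start dates fall within an assumed recovery window. The record schema, window length, and example events are assumptions for illustration, not the paper's exact method.

```python
# Flag pairs of events in the same country within an assumed recovery window.
# Schema, window length, and records are illustrative assumptions.
from datetime import date, timedelta

RECOVERY_WINDOW = timedelta(days=365)            # assumed recovery period

events = [                                       # (country, hazard type, start date)
    ("PHL", "tropical_cyclone", date(2013, 11, 8)),
    ("PHL", "earthquake",       date(2013, 10, 15)),
    ("NPL", "earthquake",       date(2015, 4, 25)),
]

def consecutive_pairs(records, window=RECOVERY_WINDOW):
    """Yield same-country event pairs whose start dates are less than `window` apart."""
    for i, (c1, h1, d1) in enumerate(records):
        for c2, h2, d2 in records[i + 1:]:
            if c1 == c2 and abs(d1 - d2) < window:
                yield (c1, h1, h2, abs(d1 - d2).days)

for pair in consecutive_pairs(events):
    print(pair)          # -> ('PHL', 'tropical_cyclone', 'earthquake', 24)
```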

    The CATDAT damaging earthquakes database

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    In the view of the authors, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan, and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.
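
    The cross-year comparisons above rely on adjusting historic nominal losses to a common reference year. The sketch below shows the general index-ratio form of such an adjustment; CATDAT's HNDECI is a hybrid index, whereas here a single generic index with hypothetical values stands in for it.

```python
# Index-ratio normalisation of a historic loss to a target year's dollars.
# A single generic index with hypothetical values stands in for the HNDECI.
index = {1923: 1.0, 1995: 55.0, 2011: 62.0}       # hypothetical index levels by year

def adjust_loss(nominal_loss_usd, event_year, target_year=2011):
    """Express a historic nominal loss in target-year dollars via an index ratio."""
    return nominal_loss_usd * index[target_year] / index[event_year]

# e.g. a hypothetical 3.5 billion USD (1923 dollars) loss expressed in 2011 dollars
print(f"{adjust_loss(3.5e9, 1923):,.0f} USD (2011-adjusted)")
```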

    The Asynergies of Structural Disaster Risk Reduction Measures: Comparing Floods and Earthquakes

    Traditionally, building‐level disaster risk reduction (DRR) measures are aimed at a single natural hazard. However, in many countries society faces the threat of multiple hazards. Building‐level DRR measures that aim to decrease earthquake vulnerability can have opposing or conflicting effects on flood vulnerability, and vice versa. In a case study of Afghanistan, we calculate the risk of floods and earthquakes, in terms of average annual losses (AAL), in the current situation. Next, we develop two DRR scenarios, in which building‐level measures to reduce flood and earthquake risk are implemented. We use this to identify districts for which DRR measures for one hazard increase the risk of another hazard. We then also calculate the optimal situation between the two scenarios by selecting, for each district, the DRR scenario for which the AAL as a ratio of the total exposure is lowest. Finally, we assess the sensitivity of the total risk to each scenario. The optimal measure differs spatially throughout Afghanistan, but in most districts it is more beneficial to take flood DRR measures. However, in the districts where it is more beneficial to take earthquake measures, the reduction in risk is considerable (up to 40%, while flood DRR measures lead to a reduction in risk of 16% in individual districts). The introduction of asynergies between DRR measures in risk analyses allows policy‐makers to spatially differentiate building codes and other building‐level DRR measures to address the most prevalent risk without exacerbating the risk resulting from other hazards.
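
    The per-district selection step described above can be expressed compactly: for each district, choose the DRR scenario whose combined average annual loss is lowest relative to total exposure. The sketch below assumes a toy data structure; the district names, exposure values, and AAL figures are hypothetical.

```python
# Pick, per district, the DRR scenario with the lowest AAL-to-exposure ratio.
# District names, exposure values, and AAL figures are hypothetical.
districts = {
    "district_A": {"exposure": 5.0e8, "aal": {"flood_drr": 2.1e6, "eq_drr": 3.4e6}},
    "district_B": {"exposure": 2.0e8, "aal": {"flood_drr": 1.8e6, "eq_drr": 0.9e6}},
}

def optimal_scenarios(data):
    """Return, for each district, the scenario minimising AAL / total exposure."""
    choice = {}
    for name, d in data.items():
        ratios = {s: aal / d["exposure"] for s, aal in d["aal"].items()}
        choice[name] = min(ratios, key=ratios.get)
    return choice

print(optimal_scenarios(districts))    # -> {'district_A': 'flood_drr', 'district_B': 'eq_drr'}
```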

    Using rapid damage observations for Bayesian updating of hurricane vulnerability functions: A case study of Hurricane Dorian using social media

    Rapid impact assessments immediately after disasters are crucial to enable rapid and effective mobilization of resources for response and recovery efforts. These assessments are often performed by analysing the three components of risk: hazard, exposure, and vulnerability. Vulnerability curves are often constructed using historic insurance data or expert judgments, reducing their applicability to the characteristics of the specific hazard and building stock. Therefore, this paper outlines an approach to the creation of event-specific vulnerability curves, using Bayesian statistics (i.e., the zero-one inflated beta distribution) to update a pre-existing vulnerability curve (i.e., the prior) with observed impact data derived from social media. The approach is applied in a case study of Hurricane Dorian, which hit the Bahamas in September 2019. We analysed footage shot predominantly from unmanned aerial vehicles (UAVs) and other airborne vehicles, posted on YouTube in the first 10 days after the disaster. Due to its Bayesian nature, the approach can be used regardless of the amount of data available, as it balances the contribution of the prior and the observations.
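
    A much-simplified sketch of the updating idea follows. The zero-one inflated beta model places point masses on "no damage" (0) and "total destruction" (1); here only those two mixture weights are updated with conjugate Beta-Binomial steps from building counts observed in post-event footage, whereas the full model also updates the continuous beta component. All prior parameters and counts are hypothetical.

```python
# Conjugate Beta-Binomial updates for the zero/one mixture weights of a
# zero-one inflated beta vulnerability model (simplified; hypothetical numbers).

def beta_update(alpha, beta, successes, trials):
    """Update a Beta(alpha, beta) prior with Binomial observations."""
    return alpha + successes, beta + (trials - successes)

prior_zero = (2.0, 8.0)          # prior on P(no damage), hypothetical
prior_one = (1.0, 9.0)           # prior on P(total destruction), hypothetical

# Hypothetical counts extracted from geolocated post-event footage
n_buildings, n_undamaged, n_destroyed = 200, 30, 90

post_zero = beta_update(*prior_zero, n_undamaged, n_buildings)
post_one = beta_update(*prior_one, n_destroyed, n_buildings)

for label, (a, b) in [("P(no damage)", post_zero), ("P(destroyed)", post_one)]:
    print(f"{label}: posterior mean = {a / (a + b):.2f}")
```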

    Improving mortality rate estimates for management of the Queensland saucer scallop fishery

    This research was undertaken on the Queensland saucer scallop (Ylistrum balloti) fishery in southeast Queensland, which is an important component of the Queensland East Coast Otter Trawl Fishery (QECOTF). It was carried out by a collaborative team from the Queensland Department of Agriculture and Fisheries, James Cook University (JCU), and the Centre for Applications in Natural Resource Mathematics (CARM), University of Queensland, and focused on 1) an annual fishery-independent trawl survey of scallop abundance, 2) relationships between scallop abundance and physical properties of the seafloor, and 3) deriving an updated estimate of the scallop’s natural mortality rate. The scallop fishery used to be one of the state’s most valuable commercially fished stocks, with the annual catch peaking at just under 2000 t (adductor muscle meat-weight) in 1993, valued at about $30 million, but in recent years the stock has declined and is currently considered to be overfished. Results from the study are used to improve monitoring, stock assessment, and management advice for the fishery.

    The effects of arbuscular mycorrhizal fungal colonisation on nutrient status, growth, productivity, and canker resistance of apple (Malus pumila)

    We assess whether arbuscular mycorrhizal fungi (AMF) improve growth, nutritional status, phenology, flower and fruit production, and disease resistance in woody perennial crops, using apple (Malus pumila) as a study system. In a fully factorial experiment, young trees were grown for 3 years with or without AMF (Funneliformis mosseae and Rhizophagus irregularis), and with industrial standard fertiliser applications or restricted fertiliser (10% of standard). We used two commercial scions (Dabinett and Michelin) and rootstocks (MM111 and MM106). Industrial standard fertiliser applications reduced AMF colonisation and root biomass, potentially increasing drought sensitivity. Mycorrhizal status was influenced by above-ground genotypes (scion type) but not rootstocks, indicating strong interactions between above- and below-ground plant tissue. AMF inoculation significantly increased resistance to Neonectria ditissima, a globally economically significant fungal pathogen of apple orchards, but did not consistently alter leaf nutrients, growth, phenology, or fruit and flower production. This study significantly advances understanding of AMF benefits to woody perennial crops, especially increased disease resistance, which we show is not due to improved tree nutrition or drought alleviation. Breeding programmes and standard management practices can limit the potential for these benefits.

    Investigation of superstorm Sandy 2012 in a multi-disciplinary approach

    At the end of October 2012, Hurricane Sandy moved from the Caribbean Sea into the Atlantic Ocean and entered the United States not far from New York. Along its track, Sandy caused more than 200 fatalities and severe losses in Jamaica, The Bahamas, Haiti, Cuba, and the US. This paper demonstrates the capability and potential for near-real-time analysis of catastrophes. It is shown that the impact of Sandy was driven by the superposition of different extremes (high wind speeds, storm surge, heavy precipitation) and by cascading effects. In particular, the interaction between Sandy and an extra-tropical weather system created a huge storm that affected large areas in the US. It is examined how Sandy compares to historic hurricane events, both from a hydro-meteorological and an impact perspective. The distribution of losses to different sectors of the economy is calculated with simple input-output models as well as government estimates. Direct economic losses are estimated at about USD 4.2 billion in the Caribbean and between USD 78 and 97 billion in the US. Indirect economic losses from power outages are estimated in the order of USD 16.3 billion. Modelling sector-specific dependencies quantifies total business interruption losses at between USD 10.8 and 15.5 billion. Thus, seven years after the record impact of Hurricane Katrina in 2005, Hurricane Sandy is the second costliest hurricane in the history of the United States.
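
    For the indirect-loss figures above, the "simple input-output" idea can be illustrated as follows: with a matrix of technical coefficients A, a drop in final demand d propagates through the economy as x = (I - A)^(-1) d via the Leontief inverse. The two-sector matrix and demand shock below are hypothetical and are not the Sandy data.

```python
# Leontief input-output propagation of a final-demand shock (hypothetical data).
import numpy as np

A = np.array([[0.20, 0.10],                     # technical coefficients (hypothetical)
              [0.15, 0.25]])
demand_shock = np.array([-2.0e9, -0.5e9])       # direct drop in final demand, USD

total_output_change = np.linalg.inv(np.eye(2) - A) @ demand_shock
indirect_loss = total_output_change.sum() - demand_shock.sum()

print(f"Total output change: {total_output_change.sum():,.0f} USD")
print(f"Of which indirect:   {indirect_loss:,.0f} USD")
```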