Systemic risks perspectives of Eyjafjallajökull volcano's 2010 eruption
In 2010, southern Iceland's Eyjafjallajökull volcano erupted, releasing ash that spread across Europe. Because of the ash's potential to damage aircraft, much of European airspace was closed for six days. The eruption brought known problems regarding the anticipation of and response to systemic risks to the forefront. To contribute a deeper understanding of this situation, this paper explores the disaster through its fundamental causes and cascading impacts, highlighting perspectives from disaster risk reduction, complexity sciences, and health in order to support the analysis and resolution of systemic risks. Two principal future directions emerge from this work: first, how to manage dependency on air travel; second, how to think about and act to avert future calamities.
Rapid emergency assessment of ash and gas hazard for future eruptions at Santorini Volcano, Greece
Hazard assessments for long-dormant volcanoes typically have to be made rapidly, in the face of considerable uncertainty and often poor information. A conditional (assuming an eruption), scenario-based probabilistic approach to such an assessment is presented here for Santorini volcano (Greece). The rapid assessment was developed and implemented in response to the 2011-2012 unrest crisis in order to inform emergency management and planning. This paper synthesises the results presented to the Greek National Committee and the scientific community involved. Two plausible eruptions at Santorini were investigated, using multiple inputs and dispersal models, based on observations of historic eruptions and expert judgement. For ash hazard, a 'most likely' eruption scenario was developed, characterised by slow lava extrusion over periods of one to two years with weak but persistent explosions and ash venting up to 3 km. A second 'largest considered' sub-Plinian explosive scenario assumed a 12 km high column of 4-h duration. For gas hazard, constant fluxes of 200 and 800 tons/day of SO2 were assumed for the duration of the eruption scenarios, noting that there is very little evidence to constrain SO2 flux from Santorini eruptions. Statistical models of likely wind conditions with height and season were developed from decadal reanalysis time series, showing that consistent low-altitude winds were rarely maintained for more than a few days. Stochastic models of ash (TEPHRA2, VOL-CALPUFF) and gas (AERMOD) dispersal provided outputs in the form of probability maps and exceedance probability curves for key loading and concentration thresholds at important locations on the island. The results from the rapid assessments presented in this paper confirm that ash and gas hazard is likely to be of concern if an eruption of Santorini occurs. Higher hazard may be expected to the south and east of the volcano, notably at important tourist and transport hubs. Low hazard to the north and northwest suggests that these may be suitable locations for emergency response centres and critical infrastructure. This approach may provide a blueprint for rapid ash and gas assessment at other long-dormant volcanoes, and we provide suggestions for refining the methods used.
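The abstract above describes turning ensembles of stochastic dispersal runs into exceedance probability curves for key thresholds. The following is a minimal sketch of that post-processing step, not the authors' code: the `loadings` array is a hypothetical stand-in for ash loadings produced at one site by many TEPHRA2/VOL-CALPUFF runs, and the thresholds are illustrative.

```python
# Illustrative sketch: deriving exceedance probabilities at one location
# from an ensemble of stochastic dispersal simulations (hypothetical data).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensemble: ash loading (kg/m^2) at one site from 1,000 runs,
# each run sampling different wind fields and eruption source parameters.
loadings = rng.lognormal(mean=0.0, sigma=1.5, size=1000)

# Exceedance probability for key loading thresholds of interest.
thresholds = np.array([0.1, 1.0, 10.0, 100.0])
exceedance = [(loadings > t).mean() for t in thresholds]

for t, p in zip(thresholds, exceedance):
    print(f"P(loading > {t:6.1f} kg/m^2) = {p:.3f}")
```

Repeating this calculation on a grid of sites yields the probability maps mentioned in the abstract.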
Identifying analogues for data-limited volcanoes using hierarchical clustering and expert knowledge: a case study of Melimoyu (Chile)
Determining the eruption frequency-magnitude (f-M) relationship for data-limited volcanoes is challenging since it requires a comprehensive record of past eruptive activity. This is the case for Melimoyu, a long-dormant and data-limited volcano in the Southern Volcanic Zone (SVZ) in Chile with only two confirmed Holocene eruptions (VEI 5). To supplement the eruption records, we identified analogue volcanoes for Melimoyu (i.e., volcanoes that behave similarly and are identified through shared characteristics) using a quantitative and objective approach. Firstly, we compiled a global database containing 181 variables describing the eruptive history, tectonic setting, rock composition, and morphology of 1,428 volcanoes. This database was filtered primarily based on data availability into an input dataset comprising 37 numerical variables for 438 subduction zone volcanoes. Then, we applied Agglomerative Nesting, a bottom-up hierarchical clustering algorithm, to three datasets derived from the input dataset: 1) raw data, 2) output from a Principal Component Analysis, and 3) weighted data tuned to minimise the dispersion in the absolute probability per VEI. Lastly, we identified the best set of analogues by analysing the dispersion in the absolute probability per VEI and applying a set of criteria deemed important by the local geological service, SERNAGEOMIN, and VB. Our analysis shows that the raw data generate a low dispersion and the highest number of analogues (n = 20). More than half of these analogues are in the SVZ, suggesting that the tectonic setting plays a key role in the clustering analysis. The eruption f-M relationship modelled from the analogues' eruption data shows that if Melimoyu has an eruption, there is a 49% probability (50th percentile) of it being VEI≥4. Meanwhile, the annual absolute probability of a VEI≤1, VEI 2, VEI 3, VEI 4, and VEI≥5 eruption at Melimoyu is 4.82 × 10−4, 1.2 × 10−3, 1.45 × 10−4, 9.77 × 10−4, and 8.3 × 10−4 (50th percentile), respectively. Our work shows the importance of using numerical variables to capture the variability across volcanoes and of combining quantitative approaches with expert knowledge to assess the suitability of potential analogues. Additionally, this approach allows groups of analogues to be identified and can be easily applied to other cases using numerical variables from the global database. Future work will use the analogues to populate an event tree and define eruption source parameters for modelling volcanic hazards at Melimoyu.
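As a rough illustration of the analogue-selection idea, the sketch below applies bottom-up (agglomerative) hierarchical clustering to a small numerical feature matrix and reads off which volcanoes fall in the same cluster as the target. It is not the study's pipeline: the feature values, volcano names other than Melimoyu, and cluster count are invented placeholders.

```python
# Minimal sketch: agglomerative clustering of volcanoes on numerical
# variables, then selecting those that cluster with the target volcano.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

volcanoes = ["Melimoyu", "Volcano_A", "Volcano_B", "Volcano_C", "Volcano_D"]
# Rows: volcanoes; columns: standardised numerical variables
# (e.g. eruption rate, rock composition, edifice morphology) - placeholders.
features = np.array([
    [0.2, 1.1, 0.5],
    [0.3, 1.0, 0.4],
    [1.5, -0.2, 2.0],
    [0.1, 1.2, 0.6],
    [1.4, -0.1, 1.8],
])

# Bottom-up hierarchical clustering with average linkage on Euclidean distances.
tree = linkage(pdist(features, metric="euclidean"), method="average")
labels = fcluster(tree, t=2, criterion="maxclust")

target_label = labels[volcanoes.index("Melimoyu")]
analogues = [v for v, lab in zip(volcanoes, labels)
             if lab == target_label and v != "Melimoyu"]
print("Candidate analogues:", analogues)
```

In the study itself, candidate sets produced this way were then screened against expert criteria and the dispersion in the per-VEI probabilities before eruption records were pooled.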
Review article: Natural hazard risk assessments at the global scale
Since 1990, natural hazards have led to over 1.6 million fatalities globally, and economic losses are estimated at an average of around $260–310 billion per year. The scientific and policy communities recognise the need to reduce these risks. As a result, the last decade has seen a rapid development of global models for assessing risk from natural hazards at the global scale. In this paper, we review the scientific literature on natural hazard risk assessments at the global scale, and specifically examine whether and how they have examined future projections of hazard, exposure, and/or vulnerability. In doing so, we examine similarities and differences between the approaches taken across the different hazards, and identify potential ways in which different hazard communities can learn from each other. For example, we show that global risk studies focusing on hydrological, climatological, and meteorological hazards have included future projections and disaster risk reduction measures (in the case of floods), whilst these are missing in global studies related to geological hazards. The methods used for projecting future exposure in the former could be applied to the geological studies. On the other hand, studies of earthquake and tsunami risk are now using stochastic modelling approaches to allow for a fully probabilistic assessment of risk, which could benefit the modelling of risk from other hazards. Finally, we discuss opportunities for learning from methods and approaches being developed and applied to assess natural hazard risks at more continental or regional scales. Through this paper, we hope to encourage dialogue on knowledge sharing between scientists and communities working on different hazards and at different spatial scales.
Natural hazard risk assessments at the global scale
Since 1990, natural hazards have led to over 1.6 million fatalities globally, and economic losses are estimated at an average of around USD 260–310 billion per year. The scientific and policy communities recognise the need to reduce these risks. As a result, the last decade has seen a rapid development of global models for assessing risk from natural hazards at the global scale. In this paper, we review the scientific literature on natural hazard risk assessments at the global scale, and we specifically examine whether and how they have examined future projections of hazard, exposure, and/or vulnerability. In doing so, we examine similarities and differences between the approaches taken across the different hazards, and we identify potential ways in which different hazard communities can learn from each other. For example, there are a number of global risk studies focusing on hydrological, climatological, and meteorological hazards that have included future projections and disaster risk reduction measures (in the case of floods), whereas fewer exist in the peer-reviewed literature for global studies related to geological hazards. On the other hand, studies of earthquake and tsunami risk are now using stochastic modelling approaches to allow for a fully probabilistic assessment of risk, which could benefit the modelling of risk from other hazards. Finally, we discuss opportunities for learning from methods and approaches being developed and applied to assess natural hazard risks at more continental or regional scales. Through this paper, we hope to encourage further dialogue on knowledge sharing between disciplines and communities working on different hazards and risks and at different spatial scales.
Reconstructing eruptions at a data limited volcano: A case study at Gede (West Java)
Understanding past eruption dynamics at a volcano is crucial for forecasting the range of possible future eruptions and their associated hazards and risk. In this work, we use numerical models to recreate the footprints of pyroclastic density currents (PDCs) and tephra fall from three eruptions at Gede volcano, Indonesia, with the aim of gaining further insight into these past eruptions and identifying suitable eruption source parameters for future hazard and risk assessment. Gede has the largest number of people living within 100 km of any volcano worldwide and has exhibited recent unrest activity, yet little is known about its eruptive history. For PDCs, we used Titan2D to recreate geological deposits dated at 1.2 and c. 1 kyrs BP. An objective and quantitative multi-criteria method was developed to evaluate the fit of 342 model simulations against field observations. In recreating the field deposits, we were able to identify the best-fitting values to reconstruct these eruptions. We found that the 1.2 kyrs BP geological deposits could be reproduced with Titan2D using either a dome collapse or a column collapse as the triggering mechanism, although a relatively low basal friction angle of 6° would suggest that the PDCs were highly mobile. For the 1 kyrs BP PDC, a column-collapse mechanism and a higher basal friction angle were required to fit the geological deposits. In agreement with previous studies, we found that Titan2D simulations were most sensitive to the basal friction angle parameter. We used Tephra2 to recreate historic observations of tephra dispersed to Jakarta and Gunung Patuha during the last known magmatic eruption of Gede in 1948. In the absence of observable field deposits, or detailed information from the published literature, we stochastically sampled eruption source parameters from wide ranges informed by analogous volcanic systems, allowing us to constrain the eruption dynamics capable of dispersing tephra to the most populous city in Indonesia, Jakarta. Our modelling suggests that the deposition of tephra fall in Jakarta during the November 1948 eruption was a very low probability event, with a < 1% chance of occurrence. Through this work, we show how the reconstruction of past eruptions with numerical models can improve our understanding of past eruption dynamics when faced with epistemic uncertainty. At Gede volcano, this provides a crucial step towards the reduction of risk to nearby populations through volcanic hazard assessment.
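The stochastic sampling strategy described above can be illustrated with a toy Monte Carlo sketch. This is not the study's workflow (which used Tephra2 for the dispersal step): source parameters are drawn from wide ranges, a crude wind-alignment and plume-height check stands in for the dispersal model, and all numbers, including the bearing to the target city, are hypothetical.

```python
# Toy sketch: sampling eruption source parameters and estimating the fraction
# of runs that could deposit tephra at a distant target site (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_runs = 10_000

# Sampled eruption source parameters (wide, illustrative ranges).
column_height_km = rng.uniform(5.0, 20.0, n_runs)      # plume height
wind_direction_deg = rng.uniform(0.0, 360.0, n_runs)   # direction wind blows towards

# Crude proxy for "tephra reaches the target": wind must point within
# +/- 15 degrees of the bearing to the site, and the plume must be high
# enough to transport ash that far. A real study would run a dispersal model.
bearing_to_target_deg = 330.0   # hypothetical bearing from vent to target city
angular_diff = np.abs(((wind_direction_deg - bearing_to_target_deg) + 180) % 360 - 180)
aligned = angular_diff < 15
high_enough = column_height_km > 12.0

p_deposit = np.mean(aligned & high_enough)
print(f"Estimated probability of tephra at target: {p_deposit:.3%}")
```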
Gamma Ray Bursts as Probes of the Distant Universe
We review recent results on the high-redshift universe and the cosmic evolution obtained using Gamma Ray Bursts (GRBs) as tracers of high-redshift galaxies. Most of the results come from photometric and spectroscopic observations of GRB host galaxies once the afterglow has faded away, but also from the analysis of the GRB afterglow line of sight as revealed by absorptions in their optical spectrum.
Comment: 20 pages, 4 figures. To appear in a special issue of Comptes Rendus Physique "GRB studies in the SVOM era", Eds. F. Daigne, G. Dubu
Estimates of live-tree carbon stores in the Pacific Northwest are sensitive to model selection
Background: Estimates of live-tree carbon stores are influenced by numerous uncertainties. One of them is model-selection uncertainty: one has to choose among multiple empirical equations and conversion factors that can be plausibly justified as locally applicable to calculate the carbon store from inventory measurements such as tree height and diameter at breast height (DBH). Here we quantify the model-selection uncertainty for the five most numerous tree species in six counties of northwest Oregon, USA.
Results: The results of our study demonstrate that model-selection error may introduce 20 to 40% uncertainty into a live-tree carbon estimate, possibly making this form of error the largest source of uncertainty in the estimation of live-tree carbon stores. The effect of model selection could be even greater if models are applied beyond the height and DBH ranges for which they were developed.
Conclusions: Model-selection uncertainty is potentially large enough that it could limit the ability to track forest carbon with the precision and accuracy required by carbon accounting protocols. Without local validation based on detailed measurements of (usually destructively sampled) trees, it is very difficult to choose the best model when several are available. Our analysis suggests that considering tree form in equation selection may better match trees to existing equations, and that substantial gaps exist, in terms of both species and diameter ranges, that are ripe for new model-building effort.
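Model-selection uncertainty of the kind quantified above can be illustrated with a short sketch: apply several plausible allometric equations to the same measurement and compare the resulting carbon estimates. The equations below are invented power-law placeholders, not the regional equations evaluated in the paper.

```python
# Hedged illustration of model-selection uncertainty with hypothetical
# allometric equations of the common form biomass_kg = a * DBH^b.
dbh_cm = 45.0  # diameter at breast height for one tree

models = {
    "equation_1": lambda d: 0.110 * d ** 2.40,
    "equation_2": lambda d: 0.085 * d ** 2.50,
    "equation_3": lambda d: 0.150 * d ** 2.30,
}

# Convert biomass to carbon with a ~50% carbon fraction (a common assumption).
carbon_kg = {name: 0.5 * f(dbh_cm) for name, f in models.items()}
low, high = min(carbon_kg.values()), max(carbon_kg.values())

for name, c in carbon_kg.items():
    print(f"{name}: {c:8.1f} kg C")
print(f"Spread due to model choice: {(high - low) / low:.0%}")
```

Even with these toy coefficients, the choice of equation shifts the per-tree estimate by roughly a fifth, which is consistent with the 20 to 40% range reported when such spreads are aggregated over an inventory.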
The Kalanchoe genome provides insights into convergent evolution and building blocks of crassulacean acid metabolism
Crassulacean acid metabolism (CAM) is a water-use efficient adaptation of photosynthesis that has evolved independently many times in diverse lineages of flowering plants. We hypothesize that convergent evolution of protein sequence and temporal gene expression underpins the independent emergences of CAM from C3 photosynthesis. To test this hypothesis, we generate a de novo genome assembly and genome-wide transcript expression data for Kalanchoë fedtschenkoi, an obligate CAM species within the core eudicots with a relatively small genome (~260 Mb). Our comparative analyses identify signatures of convergence in protein sequence and re-scheduling of diel transcript expression of genes involved in nocturnal CO2 fixation, stomatal movement, heat tolerance, circadian clock, and carbohydrate metabolism in K. fedtschenkoi and other CAM species in comparison with non-CAM species. These findings provide new insights into molecular convergence and building blocks of CAM and will facilitate CAM-into-C3 photosynthesis engineering to enhance water-use efficiency in crops.
Facilitating the development of controlled vocabularies for metabolomics technologies with text mining
BACKGROUND: Many bioinformatics applications rely on controlled vocabularies or ontologies to consistently interpret and seamlessly integrate information scattered across public resources. Experimental data sets from metabolomics studies need to be integrated with one another, but also with data produced by other types of omics studies in the spirit of systems biology, hence the pressing need for vocabularies and ontologies in metabolomics. However, it is time-consuming and non-trivial to construct these resources manually. RESULTS: We describe a methodology for rapid development of controlled vocabularies, a study originally motivated by the need for vocabularies describing metabolomics technologies. We present case studies involving two controlled vocabularies (for nuclear magnetic resonance spectroscopy and gas chromatography) whose development is currently underway as part of the Metabolomics Standards Initiative. The initial vocabularies were compiled manually, providing a total of 243 and 152 terms. A total of 5,699 and 2,612 new terms were acquired automatically from the literature. The analysis of the results showed that full-text articles (especially the Materials and Methods sections) are the major source of technology-specific terms, as opposed to paper abstracts. CONCLUSIONS: We suggest a text mining method for efficient corpus-based term acquisition as a way of rapidly expanding a set of controlled vocabularies with the terms used in the scientific literature. We adopted an integrative approach, combining relatively generic software and data resources for time- and cost-effective development of a text mining tool for expansion of controlled vocabularies across various domains, as a practical alternative to both manual term collection and tailor-made named entity recognition methods.
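The corpus-based term acquisition described above can be sketched very simply: extract candidate multi-word terms from text, rank them by frequency, and keep those not already in the vocabulary. This is not the authors' pipeline; the seed terms and mini-corpus below are toy placeholders.

```python
# Small sketch of corpus-based candidate-term acquisition (illustrative only).
from collections import Counter
import re

existing_vocabulary = {"chemical shift", "magnetic field"}  # toy seed terms

corpus = [
    "samples were analysed by nuclear magnetic resonance spectroscopy",
    "the chemical shift was referenced to an internal standard",
    "gas chromatography coupled to mass spectrometry was used",
]

def bigrams(text):
    """Return lower-cased word bigrams as candidate multi-word terms."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return [" ".join(pair) for pair in zip(tokens, tokens[1:])]

counts = Counter(term for doc in corpus for term in bigrams(doc))

# Frequent bigrams not already in the vocabulary become candidate new terms
# for expert review before being added to the controlled vocabulary.
candidates = [(t, c) for t, c in counts.most_common() if t not in existing_vocabulary]
print(candidates[:5])
```

A real system would work over full-text articles rather than abstracts, as the results above indicate, and would use richer term extraction than raw bigram counts.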