
    Building typological classification in Switzerland using deep learning methods for seismic assessment

    Natural disasters such as earthquakes have always posed a danger to human life. Seismic risk assessment consists of evaluating existing buildings and their expected response to an earthquake; the building exposure model plays a key role in risk calculations. In this respect, advanced techniques have been developed in recent years to speed up and automate the process from data acquisition to data interpretation, although the visual survey remains essential for training and validating Machine Learning (ML) methods. In the present study, building types are identified by exploiting the traditional visual survey to implement a Deep Learning (DL) classification model. As a first step, city mapping schemes are obtained by classifying buildings according to their main features (i.e., construction period and height class). Then, Random Forest (RF), a supervised learning algorithm, is applied to classify different building types using all their attributes. The RF model is trained and tested on the cities of Neuchâtel and Yverdon-les-Bains. The satisfactory accuracy of the results encourages applying the method to other cities, with proper adjustments to datasets, features and algorithms.
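The Random Forest step described above can be sketched with scikit-learn. The feature names and the synthetic data below are illustrative stand-ins; the study's actual survey-derived attributes and labels are not reproduced here.

```python
# Minimal sketch of a Random Forest building-type classifier.
# Hypothetical features: construction period, height class, footprint area.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500
period = rng.integers(0, 5, n)       # encoded construction-period class
height = rng.integers(1, 6, n)       # height class (number of storeys band)
area = rng.uniform(50, 400, n)       # footprint area in m^2
X = np.column_stack([period, height, area])
# Synthetic "building type" label, deterministically tied to period/height
# so the example is self-contained and reproducible.
y = (period * 2 + height > 7).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

In practice the feature matrix would come from the visual-survey attributes, and the held-out city (e.g. training on one city, testing on another) would replace the random split.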

    Large-scale features of Last Interglacial climate: Results from evaluating the lig127k simulations for the Coupled Model Intercomparison Project (CMIP6)-Paleoclimate Modeling Intercomparison Project (PMIP4)

    The modeling of paleoclimate, using physically based tools, is increasingly seen as a strong out-of-sample test of the models that are used for the projection of future climate changes. New to the Coupled Model Intercomparison Project (CMIP6) is the Tier 1 Last Interglacial experiment for 127 000 years ago (lig127k), designed to address the climate responses to stronger orbital forcing than the midHolocene experiment, using the same state-of-the-art models as for the future and following a common experimental protocol. Here we present a first analysis of a multi-model ensemble of 17 climate models, all of which have completed the CMIP6 DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. The equilibrium climate sensitivity (ECS) of these models varies from 1.8 to 5.6 °C. The seasonal character of the insolation anomalies results in strong summer warming over the Northern Hemisphere continents in the lig127k ensemble compared to the CMIP6 piControl, and a much-reduced minimum sea ice extent in the Arctic. The multi-model results indicate enhanced summer monsoonal precipitation in the Northern Hemisphere and reductions in the Southern Hemisphere. These responses are greater in the lig127k than in the CMIP6 midHolocene simulations, as expected from the larger insolation anomalies at 127 ka than at 6 ka. New syntheses of surface temperature and precipitation, targeted at 127 ka, have been developed for comparison to the multi-model ensemble. The lig127k model ensemble and the data reconstructions are in good agreement for summer temperature anomalies over Canada, Scandinavia, and the North Atlantic and for precipitation over the Northern Hemisphere continents. The model–data comparisons and mismatches point to further study of the sensitivity of the simulations to uncertainties in the boundary conditions, and of the uncertainties and sparse coverage in current proxy reconstructions.
The CMIP6–Paleoclimate Modeling Intercomparison Project (PMIP4) lig127k simulations, in combination with the proxy record, improve our confidence in future projections of monsoons, surface temperature, and Arctic sea ice, thus providing a key target for model evaluation and optimization.
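The ensemble anomalies discussed above are formed model by model before averaging, so that each model's bias cancels against its own control run. A minimal sketch, with invented model names and temperature values:

```python
# Sketch of a multi-model anomaly: lig127k minus piControl per model,
# then averaged across the ensemble. All values here are placeholders.
import numpy as np

# Hypothetical NH-summer surface temperatures (degC) per model.
lig127k   = {"modelA": 16.2, "modelB": 15.8, "modelC": 16.9}
piControl = {"modelA": 14.9, "modelB": 14.7, "modelC": 15.6}

# Anomaly is computed within each model first, so systematic model
# biases subtract out before the ensemble mean is taken.
anomalies = {m: lig127k[m] - piControl[m] for m in lig127k}
ensemble_mean = float(np.mean(list(anomalies.values())))
print(f"ensemble-mean anomaly: {ensemble_mean:.2f} degC")
```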

    Age‐Depth Models for Tropical Marine Hemipelagic Deposits Improve Significantly When Proxy‐Based Information on Sediment Composition Is Included

    Accurate age-depth models for marine sediment cores are crucial for understanding the paleo-oceanographic and paleoclimatic changes derived from these archives. To date, information on bulk sediment composition has largely been ignored as a potential source of information for improving age-depth models. Here, we explore how bulk sediment composition can be used qualitatively to improve age-depth models. We developed the BomDia algorithm, which produces age-depth models with realistic sediment accumulation rates that co-vary with the bulk sediment composition. We demonstrate that changes in marine versus terrigenous sediment deposition, inferred from bulk sediment composition, can be used to significantly improve age-depth models of hemipelagic marine deposits. Based on two marine records, each containing more than twenty radiocarbon (AMS 14C) dated levels, we show that the mean error of prediction of unused AMS 14C ages improves significantly, from 3.9% using simple linear interpolation to 2.4% (p = 0.003) when bulk sediment composition is included. The BomDia age modelling approach provides a powerful statistical tool to assess the validity of the age control points used, and may also assist in the detection of hiatuses. Testing and further development of the BomDia algorithm may be needed for application in depositional settings other than hemipelagic ones.
Key Points: (1) The accuracy and predictive quality of age-depth models for marine hemipelagic cores improve when bulk sediment composition is included; (2) the BomDia algorithm produces age-depth models with realistic accumulation rates that co-vary with bulk sediment composition; (3) the BomDia algorithm uses a Monte Carlo approach to assess age-depth model uncertainty.
Plain Language Summary: The age-depth relationship in a marine sediment core is known as an age-depth model. To build an age-depth model, that is, to produce a continuous age-depth relationship, an accurate and precise prediction of ages between age control points is needed. A new algorithm, which includes information on the bulk sediment composition in the prediction of ages, has been developed and tested for this purpose. The calcium carbonate content of the sediments is calibrated against the log ratio of titanium to calcium obtained from X-ray fluorescence and used as model input for bulk sediment composition. We tested the algorithm using data from two tropical hemipelagic cores and find that the algorithm, which we refer to as BomDia (‘a good dynamic interpolation algorithm’), produces robust age-depth models with predictive power and realistic sedimentation rates that co-vary with sediment composition. Further application and testing of the algorithm in sedimentary settings beyond tropical hemipelagic ones is recommended.
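The linear-interpolation baseline against which BomDia is evaluated can be sketched as follows: withhold one dated level, predict its age from the neighbouring control points, and report the relative error. The depths and ages below are invented, and the BomDia algorithm itself is not reproduced.

```python
# Leave-one-out test of simple linear age-depth interpolation,
# the baseline quoted in the abstract (3.9% mean error).
import numpy as np

depth = np.array([0.0, 1.0, 2.0, 3.0])   # metres below sea floor (invented)
age   = np.array([0.0, 2.1, 4.6, 6.5])   # calibrated kyr BP (invented)

# Withhold the dated level at 2.0 m and predict it from its neighbours.
held_depth, held_age = depth[2], age[2]
keep = [0, 1, 3]
predicted = float(np.interp(held_depth, depth[keep], age[keep]))
rel_error = abs(predicted - held_age) / held_age
print(f"predicted {predicted:.2f} kyr, relative error {rel_error:.1%}")
```

BomDia's improvement comes from replacing the straight line between control points with an accumulation-rate curve that co-varies with the measured sediment composition, with a Monte Carlo loop over the dating uncertainties.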

    Digital Twinning for the Prognosis of Spatial Architectures: Morandi’s Underground Pavilion in Turin

    Concrete spatial architecture was mainly built using techniques that were still experimental at the time, based on design criteria that did not consider seismic actions. The validity of accurate models accounting for such complex structural schemes can be demonstrated, but these models alone would still not support a clear comparison with the original design predictions. Unlike Morandi's other balanced-beam schemes, in the underground Pavilion V of the Turin Exhibition Center the main post-tensioned ribs are not parallel beams but run diagonally and are interconnected at multiple points, so as to obtain a spatial structure with high overall rigidity and lateral stability and to counteract the instability of the very thin webs of the main ribs. The paper focuses on how information from the experimental campaign can help formulate virtual models for prognostic and diagnostic assessments under different scenarios, such as the design of structural health monitoring activities and systems.

    FLOPROS: an evolving global database of flood protection standards

    With projected changes in climate, population and socioeconomic activity located in flood-prone areas, the global assessment of flood risk is essential to inform climate change policy and disaster risk management. While global flood risk models exist for this purpose, the accuracy of their results is greatly limited by the lack of information on the current standard of flood protection, with studies either neglecting this aspect or resorting to crude assumptions. Here we present a first global database of FLOod PROtection Standards, FLOPROS, which comprises information in the form of the flood return period associated with protection measures, at different spatial scales. FLOPROS comprises three layers of information and combines them into one consistent database. The design layer contains empirical information about the actual standard of existing protection already in place; the policy layer contains information on protection standards from policy regulations; and the model layer uses a validated modelling approach to calculate protection standards. The policy and model layers can be considered adequate proxies for the actual protection standards included in the design layer, and serve to increase the spatial coverage of the database. Based on this first version of FLOPROS, we suggest a number of strategies to further extend and increase the resolution of the database. Moreover, as the database is intended to be continually updated while flood protection standards change with new interventions, FLOPROS requires input from the flood risk community. We therefore invite researchers and practitioners to contribute information to this evolving database by corresponding with the authors.
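The three-layer merge can be sketched as a simple fall-through lookup: empirical design data take precedence, then policy regulations, then modelled estimates. The region names and return periods below are illustrative, not actual FLOPROS database values, and the merge rule is an assumed reading of the layer hierarchy described above.

```python
# Sketch of merging the FLOPROS layers into one protection standard
# (flood return period in years) per region.
def merge_layers(design, policy, model):
    """Pick, per region, the value from the most reliable layer available."""
    regions = set(design) | set(policy) | set(model)
    merged = {}
    for r in regions:
        # Fall through the layers in order: design > policy > model.
        merged[r] = next(v for v in (design.get(r), policy.get(r), model.get(r))
                         if v is not None)
    return merged

design = {"region_a": 100}                                    # empirical
policy = {"region_a": 250, "region_b": 50}                    # regulatory
model  = {"region_a": 80, "region_b": 30, "region_c": 10}     # modelled
standards = merge_layers(design, policy, model)
print(standards)  # region_a: 100, region_b: 50, region_c: 10
```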

    Science for Loss and Damage: Findings and Propositions

    The debate on “Loss and Damage” (L&D) has gained traction over the last few years. Supported by growing scientific evidence of anthropogenic climate change amplifying frequency, intensity and duration of climate-related hazards as well as observed increases in climate-related impacts and risks in many regions, the “Warsaw International Mechanism for Loss and Damage” was established in 2013 and further supported through the Paris Agreement in 2015. Despite advances, the debate currently is broad, diffuse and somewhat confusing, while concepts, methods and tools, as well as directions for policy remain vague and often contested. This book, a joint effort of the Loss and Damage Network—a partnership effort by scientists and practitioners from around the globe—provides evidence-based insight into the L&D discourse by highlighting state-of-the-art research conducted across multiple disciplines, by showcasing applications in practice and by providing insight into policy contexts and salient policy options. This introductory chapter summarises key findings of the twenty-two book chapters in terms of five propositions. These propositions, each building on relevant findings linked to forward-looking suggestions for research, policy and practice, reflect the architecture of the book, whose sections proceed from setting the stage to critical issues, followed by a section on methods and tools, to chapters that provide geographic perspectives, and finally to a section that identifies potential policy options. 
The propositions comprise: (1) risk management can be an effective entry point for aligning perspectives and debates, if framed comprehensively, coupled with climate justice considerations and linked to established risk management and adaptation practice; (2) attribution science is advancing rapidly and is fundamental to informing actions to minimise, avert, and address losses and damages; (3) climate change research, in addition to identifying physical (hard) limits to adaptation, needs to examine soft limits to adaptation more systematically, for which we find some evidence across several geographies globally; (4) climate risk insurance mechanisms can serve the prevention and cure aspects emphasised in the L&D debate, but their solidarity and accountability aspects need further attention, for which we find tentative indications in applications around the world; and (5) policy deliberations may need to overcome the perception that L&D constitutes a win-lose negotiation “game” by developing a more inclusive narrative that highlights collective ambition for tackling risks, mutual benefits and the role of transformation.