
    A new data assimilation procedure to develop a debris flow run-out model

    Abstract Parameter calibration is one of the most problematic phases of numerical modeling, since the choice of parameters affects the model's reliability with respect to the physical problems being studied. In some cases, laboratory tests or physical models for evaluating model parameters cannot be carried out and other strategies must be adopted; numerical models reproducing debris flow propagation are one such case. Since scale problems affect the reproduction of real debris flows in the laboratory, and specific tests used to determine rheological parameters are often unavailable, calibration is usually carried out by subjectively comparing only a few quantities, such as the heights of soil deposits calculated for some sections of the debris flow, or the distance traveled by the flow, against values measured in situ after an event has occurred. Since no automatic or objective procedure has yet been produced, this paper presents a numerical procedure based on the application of a statistical algorithm, which makes it possible to define, without ambiguity, the best parameter set. The procedure has been applied to a case study for which digital elevation models from both before and after an important event exist, meaning that a good database for applying the method was available. Its application has uncovered insights that lead to a better understanding of debris flows and related phenomena.
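    The objective calibration the abstract describes can be sketched as a search over a parameter grid for the set that best reproduces observed deposit heights. The toy run-out model, the parameter names, and the grid below are all illustrative placeholders, not the paper's actual procedure:

```python
# Hypothetical sketch of objective parameter calibration: pick the
# rheological parameter set minimizing the misfit against observed
# deposit heights. The "model" here is a toy stand-in.
import itertools

def simulate_deposit_heights(mu, xi, sections):
    """Toy stand-in for a debris-flow run-out model: one deposit
    height per monitored section, as a function of a friction
    parameter mu and a turbulence parameter xi (both invented)."""
    return [10.0 * xi / (1.0 + mu * s) for s in sections]

def calibrate(observed, sections, mu_grid, xi_grid):
    """Return (sse, mu, xi) for the grid point minimizing the sum of
    squared errors between simulated and observed deposit heights."""
    best = None
    for mu, xi in itertools.product(mu_grid, xi_grid):
        sim = simulate_deposit_heights(mu, xi, sections)
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        if best is None or sse < best[0]:
            best = (sse, mu, xi)
    return best

sections = [1, 2, 3, 4]
observed = simulate_deposit_heights(0.2, 0.5, sections)  # synthetic "truth"
sse, mu, xi = calibrate(observed, sections,
                        mu_grid=[0.1, 0.2, 0.3],
                        xi_grid=[0.4, 0.5, 0.6])
print(mu, xi)  # recovers the parameters used to generate the data
```

    Replacing the subjective visual comparison with an explicit misfit function is what makes the "best parameter set" well defined.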

    Application of Statistical Methods and Process Models for the Design and Analysis of Activated Sludge Wastewater Treatment Plants (WWTPs)

    The purpose of this study is to investigate statistical procedures to quantify uncertainty and explicitly evaluate its impact on wastewater treatment plants (WWTPs). The goal is to develop a statistics-based procedure for designing WWTPs that provides reliable protection of water quality, instead of making overly conservative assumptions and adopting empirical safety factors. An innovative Monte Carlo based procedure was developed to quantify the risk of violating effluent limits as a function of various design decisions. A simulation program called StatASPS was developed to conduct Monte Carlo simulations combined with the ASM1 model. A random influent generator was developed to describe the statistical characteristics of the influent components of WWTPs. Prior to modeling, a two-directional exponential smoothing (TES) method was developed to replace non-randomly missing data during weekends and holidays. The best models were selected based on various statistics and the ability to forecast future values. The time series models were then used to generate random influent variables with the same statistical characteristics as the original data. In applications to both the Oak Ridge and Seneca WWTPs, the best Monte Carlo simulations were those conducted using historical influent data and site-specific parameter distributions, indicating that parameter uncertainty was more important for predicting uncertainty in plant performance than influent variability. Given the limitations of computing technology, the final simulations were conducted using one month's influent data. Application of the method to the two plants demonstrated that it provides a reliable and reasonable estimate of the uncertainty of plant performance. The best predictions of plant uncertainty were obtained by determining the distribution for the most sensitive parameter and holding all other model parameters constant.
The StatASPS procedure proved to be a reliable and reasonable method to design cost-effective WWTPs. With further development, this procedure could provide engineers and regulators with a high degree of confidence that the plant will perform as required, without resorting to overly conservative assumptions or large safety factors.
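    One plausible reading of the two-directional exponential smoothing step is sketched below: smooth the series once forward and once backward, then fill each missing (weekend/holiday) value with the average of the two passes. This is an assumed interpretation, not the StatASPS implementation:

```python
# Illustrative sketch (not the StatASPS code) of a two-directional
# exponential smoothing fill for non-randomly missing influent data.
def exp_smooth(xs, alpha=0.5):
    """One-directional exponential smoothing that carries the last
    level through missing values (None)."""
    out, level = [], None
    for x in xs:
        if x is not None:
            level = x if level is None else alpha * x + (1 - alpha) * level
        out.append(level)
    return out

def tes_fill(xs, alpha=0.5):
    """Fill missing values with the mean of a forward and a backward
    exponential-smoothing pass; observed values are kept as-is."""
    fwd = exp_smooth(xs, alpha)
    bwd = exp_smooth(xs[::-1], alpha)[::-1]
    filled = []
    for x, f, b in zip(xs, fwd, bwd):
        if x is not None:
            filled.append(x)
        else:
            ests = [v for v in (f, b) if v is not None]
            filled.append(sum(ests) / len(ests))
    return filled

series = [200.0, 210.0, None, None, 230.0, 240.0]  # weekend gap
print(tes_fill(series))  # gap filled between the two flanking trends
```

    Using both directions lets the fill respect the trend on each side of the gap, which a single forward pass cannot do.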

    Seismic Risk Analysis of Revenue Losses, Gross Regional Product and Transportation Systems

    Natural threats like earthquakes, hurricanes or tsunamis have shown serious impacts on communities. In the past, major earthquakes in the United States like Loma Prieta 1989 and Northridge 1994, or recent events in Italy like the L’Aquila 2009 and Emilia 2012 earthquakes, emphasized the importance of preparedness and awareness to reduce social impacts. Earthquakes impacted businesses and dramatically reduced the gross regional product. Seismic hazard is traditionally assessed using Probabilistic Seismic Hazard Analysis (PSHA). PSHA represents the hazard at a specific location well, but it is unsatisfactory for spatially distributed systems. Scenario earthquakes overcome this problem by representing the actual distribution of shaking over a spatially distributed system. The performance of distributed productive systems during the recovery process needs to be explored. Scenario earthquakes have been used to assess the risk in bridge networks and the social losses in terms of gross regional product reduction. The proposed method for scenario earthquakes has been applied to a real case study: Treviso, a city in the north east of Italy. The method requires three models: a representation of the sources (Italian Seismogenic Zonation 9), an attenuation relationship (Sabetta and Pugliese 1996) and a model of the occurrence rate of magnitudes (Gutenberg-Richter). A methodology has been proposed to reduce thousands of scenarios to a subset consistent with the hazard at each location. Earthquake scenarios, along with the Monte Carlo method, have been used to simulate business damage. The response of business facilities to earthquakes has been obtained from fragility curves for precast industrial buildings. Furthermore, from business damage the reduction of productivity has been simulated using economic data from the national statistical service and a proposed piecewise “loss of functionality” model.
To simulate the economic process in the time domain, an innovative business recovery function has been proposed. The proposed method has been applied to generate scenario earthquakes at the locations of bridges and business areas. The proposed selection methodology has been applied to reduce 8000 scenarios to a subset of 60. Subsequently, these scenario earthquakes have been used to calculate three system performance parameters: the risk in transportation networks, the risk in terms of business damage and the losses of gross regional product. A novel model for the business recovery process has been tested. The proposed model has been used to represent the business recovery process and to simulate the effects of government aid allocated for reconstruction. The proposed method has efficiently modeled the seismic hazard using scenario earthquakes. The scenario earthquakes presented have been used to assess possible consequences of earthquakes in seismically prone zones and to increase preparedness. Scenario earthquakes have been used to simulate the effects on the economy of the impacted area; a significant gross regional product reduction has been shown, up to 77% for an earthquake with 0.0003 probability of occurrence. The results showed that the limited funds available after a disaster can be distributed in a more efficient way.
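    Two of the three ingredients the abstract lists can be sketched generically: magnitudes drawn from a (doubly truncated) Gutenberg-Richter law, and ground motion from an attenuation relationship. The coefficients below are placeholders, not the Sabetta-Pugliese 1996 model, and the magnitude bounds are assumed:

```python
# Hedged sketch of scenario-earthquake generation: Gutenberg-Richter
# magnitude sampling plus a generic (placeholder) attenuation law.
import math
import random

def sample_magnitude(rng, b=1.0, m_min=4.0, m_max=7.0):
    """Draw a magnitude from a doubly truncated Gutenberg-Richter
    distribution via inverse-transform sampling."""
    beta = b * math.log(10.0)
    u = rng.random()
    return m_min - math.log(
        1.0 - u * (1.0 - math.exp(-beta * (m_max - m_min)))) / beta

def attenuate(magnitude, distance_km, c1=-1.0, c2=0.5, c3=-1.3):
    """Placeholder attenuation: log10(PGA) = c1 + c2*M + c3*log10(R).
    Coefficients are illustrative, not Sabetta-Pugliese 1996."""
    return 10.0 ** (c1 + c2 * magnitude + c3 * math.log10(distance_km))

rng = random.Random(42)
scenarios = [sample_magnitude(rng) for _ in range(1000)]
pga_at_bridge = [attenuate(m, distance_km=20.0) for m in scenarios]
```

    Repeating this at every bridge and business-area location, then thinning the scenario set so its aggregate shaking matches the site hazard, is the reduction step the thesis describes (8000 scenarios down to 60).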

    Using a systems approach to analyze the operational safety of dams

    Dam systems are arrangements of interacting components that store and convey water for beneficial purposes. Dam failures are associated with extreme consequences to human life, the environment and the economy. Existing techniques for dam safety analysis tend to focus on verifying system performance at the edge of the design envelope. In analyzing the events which occur within the design envelope, linear chain-of-events models are often used to analyze the potential outcomes for the system. These chain-of-events models require that combinations of conditions are identified at the outset of the analysis, which can be very cumbersome given the number of physically possible combinations. Additional complications arising from feedback behaviour and time are not easily overcome using existing tools. Recent work in the industry has begun to focus on systems approaches to the problem, especially stochastic simulation. Given current computational abilities, stochastic simulation may not be capable of analyzing combinations of events that have a low combined probability but potentially extreme consequences. This research focuses on developing and implementing a methodology that dynamically characterizes combinations of component operating states and their potential impacts on dam safety. Automated generation of scenarios is achieved through the use of a component operating states database that defines all possible combinations of component states (scenarios) using combinatorics. A Deterministic Monte Carlo simulation framework systematically characterizes each scenario through a number of iterations that vary adverse operating state timing, impacts and inflows. Component interactions and feedbacks are represented within the system dynamics simulation model. Simulation outcomes provide useful indicators for dam operators including conditional failure rates, times to failure, failure inflow thresholds, and reservoir level exceedance frequencies. 
Dynamic system response can be assessed directly from the simulation outcomes. The scenario results may be useful to dam owners in emergency decision-making to inform response timelines and to justify the allocation of resources. Results may also help inform the development of improved operating strategies or upgrade alternatives that can reduce the impacts of these extreme events. This work offers a significant improvement in the ability to systematically characterize the potential combinations of events and their consequences.
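    The combinatorial scenario generation described above can be sketched in a few lines: enumerate every combination of component operating states from a states database. The component names and states below are invented for illustration:

```python
# Minimal sketch of automated scenario generation from a component
# operating-states table: every combination of states is one scenario.
# Components and states are hypothetical examples, not a real dam.
import itertools

components = {
    "spillway_gate": ["normal", "stuck_closed", "stuck_open"],
    "low_level_outlet": ["normal", "blocked"],
    "level_sensor": ["normal", "failed_high", "failed_low"],
}

names = list(components)
scenarios = [dict(zip(names, combo))
             for combo in itertools.product(*components.values())]

print(len(scenarios))  # 3 * 2 * 3 = 18 scenarios
```

    Each generated scenario would then be run through many simulation iterations that vary failure timing, impacts and inflows, which is what makes low-probability, high-consequence combinations visible without hand-picking them at the outset.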

    3rd Probabilistic Workshop Technical Systems, Natural Hazards

    Modern engineering structures should ensure an economic design, construction and operation in compliance with the required safety for people and the environment. In order to achieve this aim, all contingencies and associated consequences that may possibly occur throughout the life cycle of the considered structure have to be taken into account. Today, the development is often based on decision theory, methods of structural reliability and the modeling of consequences. Failure consequences are one of the significant issues that determine optimal structural reliability. In particular, consequences associated with the failure of structures are of interest, as they may lead to significant indirect consequences, also called follow-up consequences. However, apart from determining safety levels based on failure consequences, it is also crucially important to have effective models for stress forces and maintenance planning ... (from the preface)

    6th International Probabilistic Workshop - 32. Darmstädter Massivbauseminar: 26-27 November 2008 ; Darmstadt, Germany 2008 ; Technische Universität Darmstadt

    These are the proceedings of the 6th International Probabilistic Workshop, formerly known as the Dresden Probabilistic Symposium or International Probabilistic Symposium. The workshop was held twice in Dresden, then moved to Vienna, Berlin, Ghent and finally to Darmstadt in 2008. All of the conference cities have their special features, but Darmstadt has a very particular one: element number 110, Darmstadtium, was named after the city, and there are only very few cities worldwide after which a chemical element is named. The high element number of Darmstadtium indicates that much research is still required and carried out. This is also true for probabilistic safety concepts in engineering. Although their history can be traced back nearly 90 years, a long way still remains for practical applications. This is not a disadvantage: just as research chemists strive to discover new element properties, with the application of new probabilistic techniques we may advance the properties of structures substantially. (excerpt from the preface)

    A model for assessing water quality risk in catchments prone to wildfire

    Post-fire debris flows can have erosion rates up to three orders of magnitude higher than background rates. They are major sources of fine suspended sediment, which is critical to the safety of water supply from forested catchments. Fire can cover parts or all of these large catchments, and burn severity is often heterogeneous. The probability of spatial and temporal overlap of fire disturbance and rainfall events, and the susceptibility of hillslopes to severe erosion, determine the risk to water quality. Here we present a model to calculate recurrence intervals of high-magnitude sediment delivery from runoff-generated debris flows to a reservoir in a large catchment (>100 km²), accounting for heterogeneous burn conditions. Debris flow initiation was modelled with indicators of surface runoff and soil surface erodibility. Debris flow volume was calculated with an empirical model, and fine sediment delivery was calculated using simple, expert-based assumptions. In a Monte Carlo simulation, wildfire was modelled with a fire spread model using historic data on weather and ignition probabilities for a forested catchment in central Victoria, Australia. Multiple high-intensity storms covering the study catchment were simulated using Intensity-Frequency-Duration relationships, and the runoff indicator was calculated with a runoff model for hillslopes. A sensitivity analysis showed that fine sediment delivery is most sensitive to variables related to the texture of the source material, the debris flow volume estimation, and the proportion of fine sediment transported to the reservoir. As a measure of indirect validation, denudation rates of 4.6–28.5 mm ka⁻¹ were estimated and compared well to other studies in the region. From the results it was extrapolated that, in the absence of fire management intervention, the critical sediment concentrations in the studied reservoir could be exceeded at intervals of 18–124 years.
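    The core of the recurrence-interval calculation can be sketched as a back-of-envelope Monte Carlo: simulate many years, flag those in which a fire year and a high-intensity storm overlap and the resulting sediment delivery exceeds a critical threshold, then invert the annual exceedance probability. All probabilities and the threshold below are invented for illustration:

```python
# Toy Monte Carlo of fire/storm overlap driving sediment delivery,
# and the recurrence interval of threshold exceedance. All numbers
# are illustrative assumptions, not the study's values.
import random

def simulate_annual_delivery(rng):
    """A debris flow delivers fine sediment only when a fire year
    (assumed p=0.05) coincides with a high-intensity storm (p=0.3)."""
    fire = rng.random() < 0.05
    storm = rng.random() < 0.3
    return rng.uniform(5e4, 5e5) if (fire and storm) else 0.0

rng = random.Random(7)
years = 200_000
threshold = 1e5  # critical fine-sediment load (assumed units/value)
exceedances = sum(simulate_annual_delivery(rng) > threshold
                  for _ in range(years))
recurrence_interval = years / exceedances
print(round(recurrence_interval, 1))  # roughly 75 years for these inputs
```

    The study's machinery (fire spread, IFD storms, runoff, debris-flow volume) replaces the two coin flips here, but the exceedance-counting step is the same.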

    NESC Peer-Review of the Flight Rationale for Expected Debris Report

    Since the loss of Columbia on February 1, 2003, the Space Shuttle Program (SSP) has significantly improved the understanding of launch and ascent debris, implemented hardware modifications to reduce debris, and conducted tests and analyses to understand the risks associated with expected debris. The STS-114 flight rationale for expected debris relies on a combination of all three of these factors. A number of design improvements have been implemented to reduce debris at the source. The External Tank (ET) thermal protection system (TPS) foam has been redesigned and/or process improvements have been implemented in the following locations: the bipod closeout, the first ten feet of the liquid hydrogen (LH2) tank protuberance air load (PAL) ramp, and the LH2 tank-to-intertank flange closeout. In addition, the forward bipod ramp has been eliminated and heaters have been installed on the bipod fittings and the liquid oxygen (LO2) feedline forward bellows to prevent ice formation. The Solid Rocket Booster (SRB) bolt catcher has been redesigned. The Orbiter reaction control system (RCS) thruster cover "butcher paper" has been replaced with a material that sheds at a low velocity. Finally, the pad area has been cleaned to reduce debris during lift-off.

    Probabilistic Performance-Based Hurricane Engineering (PBHE) Framework

    In modern times, hurricanes have caused enormous losses to communities worldwide, both in terms of property damage and loss of life. In light of these losses, a comprehensive methodology is required to improve the quantification of risk and the design of structures subject to hurricane hazard. This research develops a probabilistic Performance-Based Hurricane Engineering (PBHE) framework for hurricane risk assessment. The proposed PBHE is based on the total probability theorem, similar to the Performance-Based Earthquake Engineering (PBEE) framework developed by the Pacific Earthquake Engineering Research (PEER) Center, and to the Performance-Based Wind Engineering (PBWE) framework. The methodology presented in this research disaggregates the risk assessment into independent elementary components, namely hazard analysis, structural characterization, interaction analysis, structural analysis, damage analysis, and loss analysis. It also accounts for the multi-hazard nature of hurricane events by including the separate effects of, as well as the interaction among, hurricane wind, flood, windborne debris, and rainfall hazards. This research uses the PBHE framework with multi-layer Monte Carlo Simulation (MCS) for the loss analysis of structures subject to hurricane hazard. The interaction of different hazard sources is integrated into the framework, and their effect on the risk assessment of non-engineered structures, such as low-rise residential buildings, is investigated. The performance of popular storm mitigation techniques and design alternatives for residential buildings is also compared from a cost-benefit perspective. Finally, the PBHE framework is used for risk assessment of engineered structures, such as tall buildings. The PBHE approach introduced in this study represents a first step toward a rational methodology for risk assessment and design of structures subjected to multi-hazard scenarios.
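    The disaggregation into hazard, structural, damage, and loss analyses chained by Monte Carlo simulation can be sketched as a pipeline in which the output of each stage feeds the next. Every distribution, model, and constant below is a placeholder standing in for the framework's real analyses:

```python
# Hedged sketch of a PBHE-style Monte Carlo chain:
# hazard intensity -> structural response -> damage -> loss.
# All models and numbers are illustrative assumptions.
import random

def sample_wind_speed(rng):
    """Hazard analysis: peak wind speed (m/s), assumed Weibull."""
    return rng.weibullvariate(30.0, 2.0)

def structural_response(v):
    """Structural/interaction analysis: a toy response proxy
    growing with the square of wind speed."""
    return 0.001 * v ** 2

def damage_ratio(edp):
    """Damage analysis: fraction of building value lost, capped at 1."""
    return min(1.0, edp / 2.0)

def loss(dr, value=250_000.0):
    """Loss analysis: dollar loss for an assumed building value."""
    return dr * value

rng = random.Random(1)
n = 50_000
expected_loss = sum(
    loss(damage_ratio(structural_response(sample_wind_speed(rng))))
    for _ in range(n)
) / n
print(round(expected_loss))  # mean annual-event loss for these toy models
```

    Because each stage is an independent module, swapping in a flood or windborne-debris hazard model, or a different fragility, changes one function without touching the rest of the chain, which is the point of the total-probability disaggregation.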