
    HAZARD CONSISTENT SEISMIC PERFORMANCE ASSESSMENT OF ROCKFILL DAMS

    Slope systems such as earth/rockfill dams, waste-storage landfills, or natural slopes can undergo significant damage during a seismic event. In the seismic design of slope systems, engineers often consider the amount of seismically induced displacement as the key engineering demand parameter. Current state-of-practice procedures for estimating seismically induced slope displacements are dominated by deterministic or pseudo-probabilistic approaches that do not directly quantify the hazard associated with the estimated displacements. Instead, these approaches assume that the hazard of the ground motion intensity measure of interest (e.g., peak ground acceleration) also represents the hazard of the estimated displacements. In contrast, performance-based approaches, which are the focus of this study, provide hazard curves for the engineering demand parameter of interest (here, the amount of seismically induced displacement), from which the estimated displacements can be related directly to the design hazard level. In this study, we propose to combine the conditional scenario spectra approach with advanced numerical modeling as a benchmark to evaluate performance-based approaches that rely on simplified and analytical procedures for estimating seismically induced displacements in rockfill dams. The evaluations show that the displacement hazard curves obtained through computationally intensive numerical analyses (performed with three different constitutive models) are more conservative than the hazard curves from simplified or analytical methods. Insights from the comparisons are shared, and potential explanations for the differences are provided. Finally, there are also differences among the displacement hazard curves estimated through numerical analyses, which depend on the trade-off between volumetric/deviatoric mechanisms and damping in each constitutive model.
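The performance-based logic described above can be sketched numerically: a displacement hazard curve is obtained by integrating the conditional probability of exceeding a displacement level, given the ground motion intensity measure, over the IM hazard curve. A minimal illustration follows; the hazard values, the lognormal displacement model, and all its parameters are hypothetical stand-ins, not values from the study:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Assumed IM hazard curve: annual rate of exceeding each PGA level (illustrative)
im = [0.1, 0.2, 0.4, 0.8]           # PGA (g)
lam_im = [1e-1, 2e-2, 3e-3, 2e-4]   # annual exceedance rate

def p_exceed(d, pga, median_scale=5.0, beta=0.7):
    """P(D > d | PGA): lognormal model with median displacement
    median_scale * pga (cm); both parameters are hypothetical."""
    return 1.0 - norm_cdf(math.log(d / (median_scale * pga)) / beta)

def displacement_hazard(d):
    """lambda_D(d) = sum_i P(D > d | im_i) * |delta lambda_IM(im_i)|."""
    rates = lam_im + [0.0]
    return sum(p_exceed(d, im[i]) * (rates[i] - rates[i + 1])
               for i in range(len(im)))

lam_d = displacement_hazard(2.0)    # annual rate of exceeding 2 cm
```

Reading the resulting curve at a target annual rate (say 1e-3) gives the displacement tied directly to that hazard level, which is the step the deterministic and pseudo-probabilistic procedures skip.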

    Advanced computational methods for the seismic design and assessment of reinforced concrete structures

    The major objective of this dissertation is to develop an integrated framework for the economical and safe seismic design and assessment of new reinforced concrete structures by means of life-cycle cost and fragility analysis. This objective is achieved through the following tasks: (i) In the first part of the dissertation, a numerical calibration of some of the most popular damage indices (DIs) proposed in the literature was performed in order to quantify the extent of damage in reinforced concrete structures. (ii) A critical assessment of prescriptive design procedures was performed with reference to the behaviour factor adopted by the Eurocodes, identifying the choice that leads to the safest and most economical design; furthermore, prescriptive procedures were compared with performance-based seismic design procedures, and for this purpose a number of structural seismic design optimisation problems were formulated. Based on the calibrated DIs, further optimisation problems were formulated to identify the DI, or combination of DIs, that provides the most reliable information on damage for incorporation into a performance-based design framework. The ultimate objective of this task is to compare lower-bound designs that satisfy the design-code requirements in the most cost-effective way using a life-cycle cost analysis (LCCA) methodology. (iii) The next step was to improve the LCCA procedure with reference to both its robustness and its computational efficiency. (iv) The last objective was to improve the fragility analysis procedure, again with reference to robustness and efficiency; the efficiency is achieved by introducing a neural-network-based incremental dynamic analysis (IDA) procedure that reduces the computational effort by one order of magnitude.
    Χαρίκλεια Χ. Μητροπούλο
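The IDA procedure that the neural-network surrogate accelerates can be shown in miniature: scale one ground motion record to increasing intensity levels and record the peak response at each level. A minimal linear single-degree-of-freedom sketch with central-difference time integration; the synthetic record, period, and damping are illustrative assumptions, not the dissertation's models:

```python
import math

def peak_drift(scale, accel, dt=0.01, period=0.5, zeta=0.05):
    """Peak displacement of a linear SDOF (unit mass) under a scaled record:
    u'' + 2*zeta*w*u' + w^2*u = -scale*ag(t), central-difference scheme."""
    w = 2.0 * math.pi / period
    u_prev, u, peak = 0.0, 0.0, 0.0
    for ag in accel:
        u_next = (dt ** 2 * (-scale * ag - w * w * u)
                  + 2.0 * u - (1.0 - zeta * w * dt) * u_prev) / (1.0 + zeta * w * dt)
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# Synthetic decaying-sine "record" as a stand-in for a real accelerogram
rec = [math.sin(2 * math.pi * 2.0 * (k * 0.01)) * math.exp(-0.2 * (k * 0.01))
       for k in range(1000)]

# IDA curve: (intensity scale, peak demand) pairs at increasing scale factors
ida_curve = [(s, peak_drift(s, rec)) for s in (0.2, 0.4, 0.8, 1.6)]
```

A surrogate would be trained to predict `peak_drift` from the intensity level (and record features), replacing the many nonlinear time-history analyses that make full IDA expensive.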

    Probability distribution theory, generalisations and applications of L-moments

    In this thesis, we have studied L-moments and trimmed L-moments (TL-moments), both of which are linear functions of order statistics. We have derived expressions for the exact variances and covariances of sample L-moments and of sample TL-moments for any sample size n in terms of first- and second-order moments of order statistics from small conceptual sample sizes, which do not depend on the actual sample size n. Moreover, we have established a theorem that characterises the normal distribution in terms of these second-order moments; the characterisation suggests a new test of normality. We have also derived a method of estimation based on TL-moments which gives zero weight to extreme observations. TL-moments have certain advantages over L-moments and the method of moments: they exist whether or not the mean exists (for example, for the Cauchy distribution), and they are more robust to the presence of outliers. We have also investigated four methods for estimating the parameters of a symmetric lambda distribution: the maximum likelihood method in the one-parameter case, and L-moments, LQ-moments, and TL-moments in the three-parameter case. The L-moment and TL-moment estimators are in closed form and simple to use, while numerical methods are required for the other two methods, maximum likelihood and LQ-moments. Because of the flexibility and simplicity of the lambda distribution, it is useful for fitting data when, as is often the case, the underlying distribution is unknown. We have also studied the symmetric plotting position for quantile plots assuming a symmetric lambda distribution and conclude that the choice of the plotting-position parameter depends upon the shape of the distribution. Finally, we propose exponentially weighted moving average (EWMA) control charts to monitor the process mean and dispersion using the sample L-mean and sample L-scale, together with charts based on trimmed versions of the same statistics. The proposed control chart limits are less influenced by extreme observations than classical EWMA control chart limits, and lead to tighter limits in the presence of out-of-control observations.
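Since sample L-moments are linear combinations of order statistics, they can be computed directly from the probability-weighted moments b_r (l1 = b0, l2 = 2*b1 - b0, and so on). A short sketch using the standard formulae; the function name is ours:

```python
from math import comb

def sample_l_moments(data, nmom=4):
    """First nmom sample L-moments via probability-weighted moments:
    b_r = n^-1 * sum_i [C(i,r)/C(n-1,r)] * x_(i+1)  (0-indexed order stats),
    l_{r+1} = sum_k (-1)^(r-k) * C(r,k) * C(r+k,k) * b_k."""
    x = sorted(data)
    n = len(x)
    b = [sum(comb(i, r) * x[i] for i in range(n)) / (n * comb(n - 1, r))
         for r in range(nmom)]
    return [sum((-1) ** (r - k) * comb(r, k) * comb(r + k, k) * b[k]
                for k in range(r + 1)) for r in range(nmom)]
```

For symmetric data the odd-order L-moments beyond the first vanish, e.g. `sample_l_moments([1, 2, 3, 4, 5])` gives an L-mean of 3, an L-scale of 1, and zero third and fourth L-moments.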

    Probabilistic Assessment of Existing Masonry Structures – The Influence of Spatially Variable Material Properties and a Bayesian Method for Determining Structure-Specific Partial Factors

    For the assessment of existing masonry structures, a safety concept is required that takes into account the differences compared to the design of new masonry structures, such as the possibility of material testing, the high variability of material properties, and a potentially reduced target reliability level. Therefore, a method for determining characteristic values, structure-specific partial factors, and assessment values for the compressive strength of existing masonry is developed. For this purpose, the influence of spatially variable material properties within a masonry wall is investigated first: based on experiments on clay brick masonry, a finite element model is developed and then used in Monte Carlo simulations to quantify the effect of spatial material variability on the probability distribution of the load-bearing capacity of masonry walls under compression. The statistical uncertainty resulting from small sample sizes in material testing is considered through Bayesian statistical procedures. Prior distributions for unit, mortar, and masonry compressive strength are modelled utilising a test database for existing solid clay brick masonry. Finally, the findings are implemented in a practice-oriented method for determining assessment values of masonry compressive strength, which is validated through reliability analyses.
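The Bayesian treatment of small sample sizes can be sketched with a conjugate normal model on log-strength: a prior informed by a test database is updated with a few structure-specific tests, and a characteristic value follows from the posterior predictive distribution. All numbers and the known-scatter assumption below are illustrative, not values from the thesis:

```python
import math

def posterior_mean_var(prior_mu, prior_var, sigma_known, log_tests):
    """Normal-normal conjugate update of the ln-strength mean,
    assuming the ln-scale test scatter sigma_known is fixed."""
    n = len(log_tests)
    xbar = sum(log_tests) / n
    post_var = 1.0 / (1.0 / prior_var + n / sigma_known ** 2)
    post_mu = post_var * (prior_mu / prior_var + n * xbar / sigma_known ** 2)
    return post_mu, post_var

# Hypothetical prior from a database of existing clay brick masonry (ln MPa)
prior_mu, prior_var = math.log(5.0), 0.3 ** 2
sigma = 0.25                                     # assumed known ln-scale scatter
tests = [math.log(f) for f in (4.2, 5.6, 4.9)]   # three core tests (MPa)

mu_n, var_n = posterior_mean_var(prior_mu, prior_var, sigma, tests)
# 5% characteristic value from the lognormal posterior predictive
f_k = math.exp(mu_n - 1.645 * math.sqrt(var_n + sigma ** 2))
```

Each additional test shrinks the posterior variance, which raises the characteristic value toward the median; this is the mechanism by which structure-specific testing can justify reduced partial factors.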

    Statistical issues in the assessment of undiscovered oil and gas resources

    Prior to his untimely death, my friend Dave Wood gave me wise counsel about how best to organize a paper describing uses of statistics in oil and gas exploration. A preliminary reconnaissance of the literature alerted me to the enormous range of topics that might be covered. Geology, geophysics with particular attention to seismology, geochemistry, petroleum engineering, and petroleum economics: each of these disciplines plays an important role in petroleum exploration, and each weaves statistical thinking into its fabric in a distinctive way. An exhaustive review would be book length. Dave and I agreed that a timely review paper of reasonable length would: (1) illustrate the range of statistical thinking of oil and gas explorationists; (2) concentrate on topics with statistical novelty, show how statistical thinking can lead to better decision making, and let the reader know about important controversies that might be resolved by better use of statistical methods; and (3) focus on topics that are directly relevant to exploration decision making and resource estimation. In response to Dave's sensible suggestions, the Department of the Interior's 1989 assessment of U.S. undiscovered oil and gas will serve as a tour map for a short trip through a large territory of statistical methods and applications. Were he here to review this review, I know that it would be better than it is.
    Supported in part by the MIT Center for Energy and Environmental Policy Research.

    Probabilistic fatigue methodology and wind turbine reliability
