Aging concrete structures: a review of mechanics and concepts
The safe and cost-efficient management of our built infrastructure is a challenging task given expected service lives of at least 50 years. Despite time-dependent changes in material properties, deterioration processes and changing societal demands, structures must continue to satisfy technical requirements related to serviceability, durability, sustainability and bearing capacity. This review paper summarizes the challenges associated with the safe design and maintenance of aging concrete structures and gives an overview of concepts and approaches being developed to address these challenges.
Reliability-based Serviceability Limit State Design Procedures for Shallow Foundations
Reliability-based geotechnical foundation design focusses on soil and structure analysis that meets necessary safety, performance and/or serviceability criteria, calibrated based on probabilistic analyses and an accepted level of risk. As the civil engineering community seeks to better harmonize geotechnical and structural design methodologies, reliability-based design is being incorporated more into geotechnical limit states analysis, for example ultimate limit state (ULS; e.g., bearing capacity) and serviceability limit state (SLS; e.g., settlement) foundation design. However, additional work is required to develop robust design procedures that can be easily implemented in practice.
The main objective of this study is to advance the underlying knowledge of reliability-based serviceability limit state (RBSLS) design for shallow foundations. Particularly, this study focusses on foundations supported on plastic, fine-grained soil (e.g., clay) and aggregate pier-improved plastic, fine-grained soil.
As part of this work, two new RBSLS models were developed for footings supported on clay and aggregate pier-reinforced clay. Both SLS models were developed and calibrated using probabilistic analyses and databases compiled from the literature for high-quality footing loading tests with immediate (undrained) settlement. The new models capture the nonlinear bearing pressure-displacement behavior that is typical of footings on plastic, fine-grained soils, even at relatively small (e.g., service-level) loads. A calibrated, lumped load and resistance factor is also introduced for both models that can be easily implemented in conjunction with a pre-selected level of risk and/or reliability index.
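The kind of model described above can be illustrated with a short sketch: a generic hyperbolic bearing pressure-displacement curve checked against a service load through a lumped load-and-resistance factor. The functional form, the parameter values k1 and k2, and the factor psi are illustrative assumptions only, not the calibrated model or factors developed in this study.

```python
# Hypothetical sketch of a nonlinear (hyperbolic) bearing pressure-displacement
# SLS check with a lumped load-and-resistance factor. All parameter values are
# illustrative assumptions, not the calibrated quantities from the study.

def mobilized_pressure(y_over_B, q_ult, k1=0.01, k2=0.9):
    """Hyperbolic mobilized bearing pressure at normalized displacement y/B."""
    return q_ult * y_over_B / (k1 + k2 * y_over_B)

def sls_check(q_service, q_ult, y_allow_over_B, psi=0.6):
    """SLS check: the factored pressure mobilized at the allowable displacement
    must exceed the service-level bearing pressure (psi = lumped factor)."""
    return psi * mobilized_pressure(y_allow_over_B, q_ult) >= q_service

# Example: 150 kPa service pressure, 400 kPa ultimate capacity,
# allowable settlement of 2.5% of the footing width
print(sls_check(q_service=150.0, q_ult=400.0, y_allow_over_B=0.025))  # → True
```

The nonlinear curve is what lets the check remain meaningful at service-level loads, where footings on plastic, fine-grained soils already respond nonlinearly.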
The study continues with further discussion of calibration procedures, with focus on the impact of the correlation structure of individual load-displacement parameters and suitable factors to account for the propagation of model error and other sources of uncertainty. This phase of work also focused on re-evaluating the calibrated SLS model for footings supported on aggregate pier-reinforced clay and providing an independent evaluation using additional full-scale loading tests completed at the Oregon State University geotechnical engineering field research site (OSU GEFRS).
Finally, the calibrated load-displacement model for a footing supported on fine-grained soil was used to develop a nonlinear soil-spring model in the computer program OpenSees. The foundation spring model was combined with a previously developed 3-story steel moment-frame building model to complete a series of Monte Carlo simulations with varying levels of soil variability, investigating the roles of inherent soil variability and soil-structure interaction in foundation and structural performance.
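The Monte Carlo workflow described above can be sketched in miniature: sample footing capacity from a lognormal distribution with a given coefficient of variation (COV), push each sample through a nonlinear (here, hyperbolic) load-displacement relation, and count serviceability exceedances. The distributions and parameter values are illustrative assumptions, and the sketch omits the OpenSees structural model entirely.

```python
import math
import random

def settlement_ratio(q, q_ult, k1=0.01, k2=0.9):
    """Normalized settlement y/B from an inverted hyperbolic model; returns
    infinity when the service pressure fully mobilizes the soil spring."""
    denom = q_ult - k2 * q
    return math.inf if denom <= 0 else k1 * q / denom

def p_exceed(q_service=150.0, mean_qult=400.0, cov=0.3,
             y_allow_over_B=0.025, n=50_000, seed=1):
    """Monte Carlo probability that settlement exceeds the allowable value."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cov ** 2))   # lognormal parameters
    mu = math.log(mean_qult) - 0.5 * sigma ** 2   # matching the target mean
    hits = sum(settlement_ratio(q_service, rng.lognormvariate(mu, sigma))
               > y_allow_over_B for _ in range(n))
    return hits / n

for cov in (0.2, 0.3, 0.5):
    print(f"COV={cov:.1f}  P(y > y_allow) ≈ {p_exceed(cov=cov):.3f}")
```

Raising the soil COV raises the exceedance probability, which is the qualitative effect the simulations above were designed to quantify for the coupled foundation-structure system.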
Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain
The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units, all located in Portugal, is established using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are offered for each hotel studied.
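The core idea of Stochastic Frontier Analysis, decomposing deviations from the frontier into symmetric measurement error and one-sided inefficiency, can be illustrated with a minimal simulation. The half-normal inefficiency distribution and all parameter values below are assumptions for illustration; a real SFA study estimates these quantities by maximum likelihood from firm-level data.

```python
import math
import random

# Composed-error idea behind SFA: observed (log) output = frontier + v - u,
# where v is symmetric noise and u >= 0 is systematic inefficiency.
# Half-normal u and all parameter values are illustrative assumptions.

rng = random.Random(0)

def observed_output(frontier, sigma_v=0.1, sigma_u=0.3):
    v = rng.gauss(0.0, sigma_v)        # two-sided measurement error
    u = abs(rng.gauss(0.0, sigma_u))   # one-sided (half-normal) inefficiency
    return frontier + v - u, u

sample = [observed_output(frontier=1.0) for _ in range(100_000)]
mean_shortfall = 1.0 - sum(y for y, _ in sample) / len(sample)
mean_u = sum(u for _, u in sample) / len(sample)
print(f"average shortfall below frontier ≈ {mean_shortfall:.3f}")
print(f"average inefficiency E[u] ≈ {mean_u:.3f} "
      f"(half-normal theory: {0.3 * math.sqrt(2 / math.pi):.3f})")
```

Because the noise v averages out while the inefficiency u does not, the systematic shortfall below the frontier identifies inefficiency, which is what allows the method to rank the hotel units.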
Estimating Dependence and Risk between Gold Prices and S&P500: New Evidence from ARCH, GARCH, Copula and ES-VaR Models
This thesis examines the correlations and linkages between stock and commodity markets in order to quantify the risk facing investors in both, using the Value at Risk (VaR) measure. The risk assessed is that of losses on investments in stocks (the S&P500) and a commodity (gold). The thesis is structured around three empirical chapters, motivated by the central risk factor: the continual fluctuation of stock and commodity prices. The analysis proceeds from measuring volatility, to measuring dependence (correlation), and finally to estimating expected shortfall (ES) and VaR.
The first empirical chapter assesses volatility using ARCH and GARCH models together with basic VaR calculations, and measures correlation using the copula method. Because univariate volatility models can handle only one security at a time, the second empirical chapter measures the interdependence of the S&P500 and the Gold Price Index using a time-varying copula, investigating the risk transmission involved in investing in either and whether price movements in one affect the other. The third empirical chapter investigates expected shortfall and VaR for the S&P500 and gold prices using the ES-VaR method proposed by Patton, Ziegel and Chen (2018).
Volatility is the most popular and traditional measure of risk, and the first empirical chapter therefore employs ARCH and GARCH models. The drawback of volatility is that it ignores the direction of an investment's movement: stock prices can jump sharply higher, and investors are not distressed by gains. For investors, risk is about the odds of losing money, and VaR rests on that common-sense fact. Because investors care about the odds of large losses, VaR answers the questions: what is my worst-case scenario, and how much could I lose in a really bad month?
The results demonstrate that measuring volatility alone (ARCH/GARCH) is not sufficient to capture the risk involved in an investment; correlation-based methods and VaR perform better. To measure interdependence, a time-varying copula is used, since the dynamic structure of the dependence between the series can be modelled by allowing either the copula function or the dependence parameter to vary over time. Finally, a hybrid model addresses the average return on a risky asset, using expected shortfall jointly with quantile dependence and VaR. The Basel III Accord, phased in through 2019, emphasizes ES rather than VaR, yet little existing work models ES directly. Patton, Ziegel and Chen (2018) overcame the elicitability problem for ES by modelling ES and VaR jointly within statistical decision theory, proposing a new dynamic risk-measure model on which this thesis builds.
This research contributes to knowledge by showing that volatility alone is not enough to measure risk, that interdependence analysis captures the dependence of one variable on another, and that the estimation and inference methods of the ES-VaR model confirm that ARCH, GARCH, and other rolling-window models are insufficient for risk forecasting. The first empirical chapter documents the volatility of gold prices and the S&P500. The second finds that the conditional dependence of the two indexes is strongly time-varying, with high correlation before 2008 and a slightly stronger upper tail, indicating that their conditional dependence is influenced by positive shocks. The final chapter finds that forecasts from the ES-VaR model of Patton, Ziegel and Chen (2018) outperform forecasts based on a univariate GARCH model. Investors want to protect themselves from large losses, and the ES-VaR model would help them manage their funds accordingly.
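The first empirical chapter's toolkit, a GARCH(1,1) volatility recursion and a parametric VaR built from the resulting conditional volatility, can be sketched as follows. The GARCH parameters and the simulated return series are illustrative assumptions, not estimates from the S&P500 or gold data used in the thesis.

```python
import math
import random

def garch_volatility(returns, omega=1e-6, alpha=0.08, beta=0.9):
    """One-step-ahead conditional volatility from a GARCH(1,1) recursion,
    initialized at the unconditional variance omega / (1 - alpha - beta)."""
    var = omega / (1.0 - alpha - beta)
    for r in returns:
        var = omega + alpha * r ** 2 + beta * var
    return math.sqrt(var)

def parametric_var(sigma, level=0.99):
    """One-day VaR for normally distributed returns (z = 2.326 at 99%)."""
    z = {0.95: 1.645, 0.99: 2.326}[level]
    return z * sigma

rng = random.Random(7)
returns = [rng.gauss(0.0, 0.01) for _ in range(500)]  # simulated daily returns
sigma = garch_volatility(returns)
print(f"next-day sigma ≈ {sigma:.4f}, 99% one-day VaR ≈ {parametric_var(sigma):.4f}")
```

The VaR figure answers exactly the "worst-case" question posed above, but since it conditions on a single series and a symmetric distribution, it cannot capture the cross-asset tail dependence that motivates the copula and ES-VaR chapters.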
Statistical Degradation Models for Electronics
With the increasing presence of electronics in modern systems and everyday products, the reliability of those systems depends inextricably on the reliability of their electronics. We develop reliability models for failure-time prediction under small failure-time samples with information on individual degradation histories. The model development extends the work of Whitmore et al. (1998) to incorporate two new data structures common to reliability testing. Reliability models traditionally use lifetime information to evaluate the reliability of a device or system; to analyze small failure-time samples in dynamic environments where failure mechanisms are unknown, models are needed that make use of auxiliary reliability information. In this thesis we present models suitable for reliability data in which the degradation variables are latent and can be tracked by related observable variables that we call markers.
We provide an engineering justification for our model and develop parametric and predictive inference equations for a data structure that includes terminal observations of the degradation variable and longitudinal marker measurements. We compare maximum likelihood estimation and prediction results with those obtained by Whitmore et al. (1998) and show improved inference under small sample sizes. We introduce the modeling of variable failure thresholds within the framework of bivariate degradation models and discuss ways of incorporating covariates.
In the second part of the thesis we investigate anomaly detection through a Bayesian support vector machine and discuss its place in degradation modeling. We compute posterior class probabilities for time-indexed covariate observations, which we use as measures of degradation. Lastly, we present a multistate model for a recurrent event process and failure times. We compute the expected time to failure using counting process theory and investigate the effect of the event process on the expected failure-time estimates.
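The latent-degradation-with-marker setup can be illustrated with a small simulation in the spirit of the bivariate Wiener model of Whitmore et al. (1998): a degradation path with drift is observed only through a correlated marker, and failure occurs at the first passage of a threshold. The drift, volatilities, correlation, and threshold below are assumed values chosen for illustration.

```python
import random

def simulate_path(mu=1.0, sigma_x=0.5, sigma_m=0.3, rho=0.8,
                  threshold=10.0, dt=0.01, rng=None):
    """One unit: latent Wiener degradation x (with drift mu) and a correlated
    observable marker m; returns (first-passage failure time, marker path)."""
    rng = rng or random.Random()
    x = m = t = 0.0
    markers = []
    while x < threshold:
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + (1 - rho ** 2) ** 0.5 * rng.gauss(0.0, 1.0)
        x += mu * dt + sigma_x * dt ** 0.5 * z1   # latent degradation step
        m += mu * dt + sigma_m * dt ** 0.5 * z2   # observed marker step
        t += dt
        markers.append(m)
    return t, markers

rng = random.Random(3)
times = [simulate_path(rng=rng)[0] for _ in range(1_000)]
print(f"mean simulated failure time ≈ {sum(times) / len(times):.2f} "
      f"(inverse-Gaussian mean threshold/mu = 10.0)")
```

Only the marker path would be available in the data structures discussed above; inference then works backward from the markers (plus any terminal degradation observations) to the latent first-passage distribution.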
Uncertainty Quantification of the Responses of Transient Electromagnetic Disturbance Coupling to Transmission Lines Based on Stochastic Models
The abstract is provided in the attachment.
Sustainable Production in Food and Agriculture Engineering
This book is a collection of original research and review papers that report on the state of the art and recent advances in food and agriculture engineering, such as sustainable production and food technology. Encompassed within are applications in food and agriculture engineering, biosystems engineering, plant and animal production engineering, food and agricultural processing engineering, the storage industry, economics and production management, agricultural farm management, agricultural machines and devices, and IT for agricultural engineering and ergonomics in agriculture.