Static Fatigue: A Key Cause of Time Effects in Sand.
Freshly deposited or disturbed sands tend to alter their mechanical properties over time. This is often seen after dynamic compaction of sand beds, where the cone penetration resistance may show no immediate increase, yet over weeks and months a clear gain in penetration resistance becomes evident. This time effect has been known for decades, but there is no universally accepted explanation of what causes such time-delayed effects in sands. A hypothesis is postulated which suggests that a key contributing factor behind time effects in sands is the delayed fracturing of the micro-morphological features present on the surfaces of sand grains in contact. Experimental tests performed as part of this research support this hypothesis, and the numerical simulations indicate that the hypothesis is plausible. Experiments were performed using two custom-designed laboratory devices for testing inter-granular sand contacts, and additional tests were carried out on dry and wet sand specimens. All experimental results are consistent with the suggested hypothesis. In addition, discrete element and finite element computations were performed as part of this study on aging of sand. Discrete element simulations were able to mimic the characteristics of the true (experimental) process when the static fatigue hypothesis was used. The model also predicted an increase in horizontal stress in sand subjected to sustained vertical load with restrained horizontal deformation, which is central to explaining the time-delayed increase in cone penetration resistance of sands after disturbance. Aging of sands under confined conditions leads to an increase in inter-granular contact stiffness and, thereby, to an increase in macroscopic elastic moduli, while the internal friction angle of sand remains nearly unchanged during aging. PhD thesis, Civil Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/102292/1/siddu_1.pd
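A purely illustrative toy of the static fatigue idea is sketched below. It is not the thesis's DEM or finite element formulation; the exponential stress dependence of the fracture rate, the 5% contact-area increment per asperity fracture, and the Hertz-like stiffness-area relation are all assumptions, made only to show how delayed asperity fracturing could translate into a gradual gain in contact stiffness.

```python
import numpy as np

# Toy static-fatigue model (illustrative assumptions only, see note above).
# Each grain contact carries a fixed normal force over an area that grows every
# time a surface asperity fractures; the waiting time to the next fracture is
# exponential with a stress-dependent rate, so fracturing slows down as the
# contact flattens and its stiffness creeps upward.
rng = np.random.default_rng(0)

n_contacts = 1000
force = 1.0                          # normal force per contact (arbitrary units)
area = np.full(n_contacts, 1.0)      # initial contact area (arbitrary units)
t = np.zeros(n_contacts)             # elapsed time per contact
t_end = 1e4                          # aging period considered

for _ in range(200):                 # cap the number of fracture events
    stress = force / area
    rate = 1e-2 * np.exp(stress)     # static-fatigue (delayed fracture) rate, assumed form
    dt = rng.exponential(1.0 / rate)
    grows = (t + dt) < t_end         # contacts that fracture again before t_end
    t[grows] += dt[grows]
    area[grows] *= 1.05              # each fracture flattens the contact slightly
    t[~grows] = t_end                # the rest see no further fracturing

stiffness = np.sqrt(area)            # Hertz-like: stiffness grows with contact radius
print(f"mean relative contact stiffness gain after aging: {stiffness.mean() - 1.0:.2f}")
```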
Formation of Emotional Culture as a Component of Students' Innovative Culture
Homozygosity has long been associated with rare, often devastating, Mendelian disorders [1], and Darwin was one of the first to recognise that inbreeding reduces evolutionary fitness [2]. However, the effect of the more distant parental relatedness common in modern human populations is less well understood. Genomic data now allow us to investigate the effects of homozygosity on traits of public health importance by observing contiguous homozygous segments (runs of homozygosity, ROH), which are inferred to be homozygous along their complete length. Given the low levels of genome-wide homozygosity prevalent in most human populations, information is required on very large numbers of people to provide sufficient power [3,4]. Here we use ROH to study 16 health-related quantitative traits in 354,224 individuals from 102 cohorts and find statistically significant associations between summed runs of homozygosity (SROH) and four complex traits: height, forced expiratory lung volume in 1 second (FEV1), general cognitive ability (g) and educational attainment (nominal p < 1 × 10−300, 2.1 × 10−6, 2.5 × 10−10 and 1.8 × 10−10, respectively). In each case increased homozygosity was associated with a decreased trait value, equivalent to the offspring of first cousins being 1.2 cm shorter and having 10 months less education. Similar effect sizes were found across four continental groups and across populations with different degrees of genome-wide homozygosity, providing convincing evidence for the first time that homozygosity, rather than confounding, directly contributes to phenotypic variance. Contrary to earlier reports in substantially smaller samples [5,6], no evidence was seen of an influence of genome-wide homozygosity on blood pressure and low density lipoprotein (LDL) cholesterol, or on ten other cardio-metabolic traits. Since directional dominance is predicted for traits under directional evolutionary selection [7], this study provides evidence that increased stature and cognitive function have been positively selected in human evolution, whereas many important risk factors for late-onset complex diseases may not have been.
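As a rough illustration of the SROH construction described above, the sketch below sums qualifying homozygous runs per person and relates a trait to the total. The 1.5 Mb length threshold, the data layout, and the trait values are assumptions made for illustration; the study fits per-cohort models with covariates and then meta-analyses the estimates.

```python
import numpy as np

def sum_roh(roh_segments, min_length_bp=1_500_000):
    """roh_segments: iterable of (person_id, start_bp, end_bp) homozygous runs."""
    sroh = {}
    for person, start, end in roh_segments:
        length = end - start
        if length >= min_length_bp:          # keep only runs above the length threshold
            sroh[person] = sroh.get(person, 0) + length
    return sroh

# Toy ROH calls for a handful of people.
segments = [("A", 10_000_000, 13_000_000), ("A", 40_000_000, 41_600_000),
            ("B", 22_000_000, 30_000_000), ("C", 5_000_000, 6_600_000)]
sroh = sum_roh(segments)

people = ["A", "B", "C", "D"]                        # D carries no qualifying ROH
x = np.array([sroh.get(p, 0) for p in people], float)
y = np.array([172.0, 168.5, 171.0, 173.0])           # made-up heights (cm)
slope, intercept = np.polyfit(x, y, 1)               # simple OLS of trait on SROH
print(f"fitted trait change per Gb of SROH: {slope * 1e9:.2f} cm")
```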
Development of Risk Prediction Equations for Incident Chronic Kidney Disease
IMPORTANCE – Early identification of individuals at elevated risk of developing chronic kidney disease could improve clinical care through enhanced surveillance and better management of underlying health conditions.
OBJECTIVE – To develop assessment tools to identify individuals at increased risk of chronic kidney disease, defined by reduced estimated glomerular filtration rate (eGFR).
DESIGN, SETTING, AND PARTICIPANTS – Individual-level data analysis of 34 multinational cohorts from the CKD Prognosis Consortium including 5,222,711 individuals from 28 countries. Data were collected from April 1970 through January 2017. A two-stage analysis was performed, with each study first analyzed individually and then summarized overall using a weighted average. Because clinical variables were often differentially available by diabetes status, models were developed separately for participants with and without diabetes. Discrimination and calibration were also tested in 9 external cohorts (N=2,253,540).
EXPOSURES – Demographic and clinical factors.
MAIN OUTCOMES AND MEASURES – Incident eGFR <60 mL/min/1.73 m2.
RESULTS – In 4,441,084 participants without diabetes (mean age, 54 years; 38% female), there were 660,856 incident cases of reduced eGFR during a mean follow-up of 4.2 years. In 781,627 participants with diabetes (mean age, 62 years; 13% female), there were 313,646 incident cases during a mean follow-up of 3.9 years. Equations for the 5-year risk of reduced eGFR included age, sex, ethnicity, eGFR, history of cardiovascular disease, ever smoker, hypertension, BMI, and albuminuria. For participants with diabetes, the models also included diabetes medications, hemoglobin A1c, and the interaction between the two. The risk equations had a median C statistic for the 5-year predicted probability of 0.845 (25th–75th percentile, 0.789–0.890) in the cohorts without diabetes and 0.801 (25th–75th percentile, 0.750–0.819) in the cohorts with diabetes. Calibration analysis showed that 9 of 13 (69%) study populations had a slope of observed to predicted risk between 0.80 and 1.25. Discrimination was similar in 18 study populations in 9 external validation cohorts; calibration showed that 16 of 18 (89%) had a slope of observed to predicted risk between 0.80 and 1.25.
CONCLUSIONS AND RELEVANCE – Equations for predicting the risk of incident chronic kidney disease developed in over 5 million people from 34 multinational cohorts demonstrated high discrimination and variable calibration in diverse populations.
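For readers unfamiliar with how such risk equations are applied, the sketch below evaluates a Cox-style 5-year absolute risk as 1 − S0(5)^exp(lp), where lp is a linear predictor built from the listed variables. Every coefficient, the variable coding, and the baseline survival value are placeholders for illustration, not the published CKD-PC equations.

```python
import math

baseline_survival_5y = 0.95          # S0(5), assumed placeholder

coefs = {                            # log hazard ratios per coded unit, assumed
    "age_per_10y": 0.30,
    "female": -0.10,
    "egfr_per_5_lower": 0.25,        # risk rises as eGFR falls
    "hypertension": 0.35,
    "ever_smoker": 0.15,
    "bmi_per_5": 0.10,
    "acr_log2": 0.20,
}
reference = {k: 0.0 for k in coefs}  # centring values, assumed

def five_year_risk(x):
    """Cox-type absolute risk: 1 - S0(5) raised to exp(linear predictor)."""
    lp = sum(coefs[k] * (x[k] - reference[k]) for k in coefs)
    return 1.0 - baseline_survival_5y ** math.exp(lp)

example = {"age_per_10y": 1.0, "female": 1.0, "egfr_per_5_lower": 2.0,
           "hypertension": 1.0, "ever_smoker": 0.0, "bmi_per_5": 0.5,
           "acr_log2": 1.0}
print(f"predicted 5-year risk of incident eGFR <60: {five_year_risk(example):.1%}")
```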
Serum potassium and adverse outcomes across the range of kidney function: a CKD Prognosis Consortium meta-analysis.
Aims: Both hypo- and hyperkalaemia can have immediate deleterious physiological effects, but less is known about long-term risks. The objective was to determine the risks of all-cause mortality, cardiovascular mortality, and end-stage renal disease associated with potassium levels across the range of kidney function, and to evaluate consistency across cohorts in a global consortium. Methods and results: We performed an individual-level data meta-analysis of 27 international cohorts [10 general population, 7 high cardiovascular risk, and 10 chronic kidney disease (CKD)] in the CKD Prognosis Consortium. We used Cox regression followed by random-effects meta-analysis to assess the relationship between baseline potassium and adverse outcomes, adjusted for demographic and clinical characteristics, overall and across strata of estimated glomerular filtration rate (eGFR) and albuminuria. We included 1,217,986 participants followed up for a mean of 6.9 years. The average age was 55 ± 16 years, average eGFR was 83 ± 23 mL/min/1.73 m2, and 17% had moderate-to-severe increased albuminuria levels. The mean baseline potassium was 4.2 ± 0.4 mmol/L. The risk of serum potassium >5.5 mmol/L was related to lower eGFR and higher albuminuria. The risk relationship between potassium levels and adverse outcomes was U-shaped, with the lowest risk at serum potassium of 4-4.5 mmol/L. Compared with a reference of 4.2 mmol/L, the adjusted hazard ratio for all-cause mortality was 1.22 [95% confidence interval (CI) 1.15-1.29] at 5.5 mmol/L and 1.49 (95% CI 1.26-1.76) at 3.0 mmol/L. Risks were similar by eGFR, albuminuria, renin-angiotensin-aldosterone system inhibitor use, and across cohorts. Conclusions: Outpatient potassium levels both above and below the normal range are consistently associated with adverse outcomes, with similar risk relationships across eGFR and albuminuria.
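The pooling step described above (per-cohort Cox models combined by random-effects meta-analysis) can be illustrated with a standard DerSimonian-Laird calculation; the specific estimator is an assumption here, and the per-cohort log hazard ratios and standard errors below are invented for the example, not the consortium's estimates.

```python
import numpy as np

# Per-cohort log hazard ratios and standard errors (made-up example values).
log_hr = np.array([0.18, 0.25, 0.15, 0.30])
se = np.array([0.05, 0.08, 0.06, 0.10])

# Fixed-effect (inverse-variance) pooling, used to compute heterogeneity Q.
w_fixed = 1.0 / se**2
theta_fixed = np.sum(w_fixed * log_hr) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_hr - theta_fixed) ** 2)

# DerSimonian-Laird between-cohort variance estimate.
k = len(log_hr)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

# Random-effects pooled estimate and 95% confidence interval.
w_re = 1.0 / (se**2 + tau2)
theta_re = np.sum(w_re * log_hr) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
lo, hi = theta_re - 1.96 * se_re, theta_re + 1.96 * se_re
print(f"pooled HR = {np.exp(theta_re):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```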
Assessment of the behavior of buried concrete pipelines subjected to ground rupture: Experimental study
Rapid assessment of damage to buried pipelines from earthquake-induced ground deformation is a crucial component of recovery efforts. This paper reports on the first year of a four-year study aimed at developing rapid, reliable, and cost-effective sensing systems for health monitoring and damage detection for buried concrete pipelines subjected to ground deformation. A custom-designed sensing strategy was implemented in a ground rupture experiment with a scaled-down concrete pipeline. The behavior of the pipeline, including the failure modes and the damage inflicted on the pipe segments, was monitored during the test. Two modes of failure were identified in the test: (1) compression associated with telescoping-type deformation and (2) bending at the pipeline joints closest to the fault plane. Consequently, future research toward advancing sensing technology for concrete pipelines will likely focus on the behavior of the joints. © 2012 American Society of Civil Engineers.
A simple soil-structure interaction model for indirect damage assessment of segmented concrete pipelines during PGD
This paper describes a simple soil-structure interaction model of a buried segmented concrete pipeline that can be used for indirect health monitoring during permanent ground deformation (PGD). Buried pipelines are difficult to inspect visually, so accurate health monitoring systems can improve the efficiency and effectiveness of repair efforts immediately following an earthquake. A Winkler pipeline model is developed for indirect health monitoring that incorporates the two primary modes of failure observed in pipeline experiments, namely telescoping and rotation at the joints. Very approximate estimates of the model parameters are made, and the model results are compared to experimental results. In general, the model captures both the magnitude and patterns of joint deformation. However, the model yields axial forces that are two orders of magnitude higher than the measured values, suggesting that the first-order approximation of the joint as an elastic beam is inaccurate. Structural testing of the joints in both axial compression and rotation will allow more accurate refinement of the joint model. © 2011 ASCE.
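A minimal numerical sketch of a beam-on-Winkler-foundation calculation is given below to illustrate the class of model described. It omits the joint springs entirely, and every parameter value is an assumed round number rather than a calibrated value from the paper.

```python
import numpy as np

# Pipeline crossing a transverse ground offset: EI*w'''' + k*(w - u_g) = 0,
# with u_g a step in ground displacement at the fault (x = 0).
L, n = 20.0, 401                     # modelled length (m) and node count
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]

EI = 50e6                            # pipe flexural rigidity (N*m^2), assumed
k = 5e6                              # transverse soil spring stiffness (N/m^2), assumed
u_g = np.where(x < 0.0, 0.0, 0.30)   # 0.30 m transverse ground offset

# Finite-difference operators: D2 ~ d^2/dx^2, D2 @ D2 ~ d^4/dx^4.
D2 = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
      + np.diag(np.ones(n - 1), -1)) / h**2
A = EI * (D2 @ D2) + k * np.eye(n)
b = k * u_g

# Far from the fault the pipe simply follows the ground, so clamp the two end
# nodes on each side to the ground displacement (w = u_g, zero slope).
for i in (0, 1, n - 2, n - 1):
    A[i, :] = 0.0
    A[i, i] = 1.0
    b[i] = u_g[i]

w = np.linalg.solve(A, b)

# Bending moment M = -EI*w'' (interior nodes only); in the experiments the
# corresponding damage concentrated at the joints nearest the fault.
M = -EI * (D2 @ w)[2:-2]
imax = int(np.abs(M).argmax())
print(f"peak |M| ~ {abs(M[imax]):.3e} N*m at x = {x[2:-2][imax]:.2f} m")
```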
Underground Sensing Strategies for the Health Assessment of Buried Pipelines
Buried lifeline infrastructure, including pipelines, tunnels, and power and communication lines, is vital to the operation of the national economy. Permanent ground displacement (PGD) from earthquakes and landslides is the most serious hazard to buried pipelines, often requiring slow and expensive damage localization before repairs can be made. Given the importance of these buried lifelines, it is critical that low-cost and rapid methodologies for damage detection and localization be developed. Monitoring systems embedded in and around the pipeline are an obvious approach but typically suffer from the cost and obtrusiveness of long cable runs. The primary goal of this chapter is to illustrate novel sensing methods that can serve as the basis for monitoring buried pipelines exposed to PGD. In particular, the chapter focuses on the monitoring of segmented concrete pipelines, which typically experience damage at their joints due to PGD. Wireless telemetry is evaluated to validate wireless sensors for buried applications, thus greatly reducing the cost of dense sensor systems in regions of high PGD risk. An overview of current buried pipeline sensing technology is presented, and three full-scale experimental PGD tests are conducted to evaluate pipeline motion and damage detection methodologies in segmented concrete pipelines. Joint rotations and translations were monitored in real time with potentiometers, and direct damage measurements of the joint regions were made with acoustic emission and conductive surface sensors. Strain gages successfully captured global load transfer throughout the pipeline, validated by load cell measurements at the pipe ends. The combined sensor information is used to form a hypothesis for the damage evolution process of buried segmented concrete pipelines under PGD and to validate the use of wireless sensors for buried pipeline monitoring.
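As a back-of-the-envelope illustration of reading global load transfer from strain gages, the sketch below converts a strain reading to an axial force via F = E·A·ε. The concrete modulus, pipe cross-section, and strain value are assumed round numbers, not the test specimens' actual properties.

```python
import math

E = 30e9                        # concrete Young's modulus (Pa), assumed
outer_d, wall_t = 0.30, 0.05    # pipe outer diameter and wall thickness (m), assumed
A = math.pi / 4 * (outer_d**2 - (outer_d - 2 * wall_t)**2)   # annular cross-section (m^2)
eps = 50e-6                     # example gage reading: 50 microstrain
F = E * A * eps                 # axial force implied by the strain reading
print(f"implied axial force: {F / 1e3:.1f} kN")
```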
Using electrical, magnetic and acoustic sensors to detect damage in segmental concrete pipes subjected to permanent ground displacement
This paper describes results of an experimental study that used sensing methods to monitor damage along segmental concrete pipelines resulting from permanent ground displacement across a simulated earthquake fault. The literature contains examples of such damage occurring during actual earthquakes and significantly impairing the functionality of the pipelines. Detecting the location and extent of damage in pipelines can significantly accelerate post-earthquake repair efforts. In this paper, electrical sensing methods, magnetic sensing, and acoustic emission are used to monitor structural damage in a segmental concrete pipeline during a large-scale test, in which the pipeline was subjected to a concentrated transverse permanent ground displacement (PGD). The majority of the damage to the pipe segments was localized at the joints, especially the bell sections, while damage to the spigots was minimal. The damage extended away from the joints in the pipe segments in the immediate vicinity of the fault line. Telescoping (i.e., crushing of the bell-and-spigot) was a primary mode of failure. The results of this study indicate that electrical sensing methods (including the use of conductive grout), magnetic sensing, and acoustic emission, employed alone or in combination, can detect and quantify damage in segmental concrete pipelines. © 2011 Elsevier Ltd. All rights reserved.
Performance and damage evolution of plain and fibre-reinforced segmental concrete pipelines subjected to transverse permanent ground displacement
This paper presents the results of three full-scale experiments performed on segmental concrete pipelines subjected to permanent ground displacement. The first pipeline was made of reinforced concrete pipes, the second of steel fibre-reinforced concrete pipes, and the third of a combination of fibre-reinforced and reinforced concrete pipes. An array of sensing techniques was used to assess the damage evolution in the pipelines and their overall performance. Three stages of damage were observed. In the first stage, damage was concentrated in the joints near the fault line. In the second stage, damage occurred in all joints along the pipeline. While in the first two stages damage was mainly concentrated at the bell and spigot joints of the pipe segments, the third stage was characterised by severe damage and rupture of the body of pipe segments located in the immediate vicinity of the fault line. The modes of failure for the plain and fibre-reinforced concrete pipelines were similar in the first and second stages of damage. However, in the pipeline constructed using both plain and fibre-reinforced concrete pipe segments, the damage was concentrated in the standard reinforced concrete pipe segments.