
    Effects of Deicing Salts on the Durability of Concrete Incorporating Supplementary Cementitious Materials

    Structural Engineering and Materials Laboratory of the University of Kansas


    The durability of concrete mixtures incorporating one of two supplementary cementitious materials (SCMs), slag cement or Class C fly ash, exposed to sodium chloride (NaCl), calcium chloride (CaCl2), or magnesium chloride (MgCl2) is evaluated based on damage due to wetting and drying and scaling, with the goal of determining appropriate replacement levels of portland cement using these SCMs. The mixtures had water-to-cementitious material ratios of 0.38 and 0.44 and SCM percentage replacements of portland cement of 0%, 20%, 35%, and 50% by volume. Fourteen concrete mixtures (204 specimens) were cast to evaluate the damage to concrete subjected to 300 cycles of wetting and drying while exposed to solutions of one of the three deicing salts or deionized water in which the temperature of the specimens ranged from 39.2 °F (4 °C) during the wetting cycle to 73 ± 3 °F (23 ± 2 °C) during the drying cycle. Of special interest were the effects of CaCl2 and MgCl2, which result in the formation of calcium oxychloride. Durability was evaluated based on the average relative dynamic modulus of elasticity and the nature of physical damage, if any. Mixtures subjected to CaCl2 or MgCl2 that exhibit no spalling at test completion are considered to be durable. Ten concrete mixtures (156 specimens) were cast with curing periods of 14 or 28 days to investigate scaling over 56 cycles in accordance with Quebec test BNQ NQ 2621-900 using NaCl or CaCl2. Mixtures are considered durable if the average cumulative mass losses are less than the BNQ NQ 2621-900 failure limit of 0.10 lb/ft2. The results show that wetting and drying with deionized water or an NaCl solution does not cause deterioration of concrete. Exposure to CaCl2 and MgCl2, however, both of which result in the formation of calcium oxychloride, causes physical damage and a reduction in the dynamic modulus of concrete with portland cement as the only binder, with CaCl2 being the more deleterious of the two. 
A partial replacement of portland cement with either slag cement or Class C fly ash is effective in producing durable concrete exposed to CaCl2 or MgCl2. Using a 20% replacement of portland cement with an SCM, however, is not sufficient to produce durable concrete under conditions that result in the formation of calcium oxychloride, while replacing 35% or 50% of the portland cement with one of the SCMs used in this study is. The results also show that using slag cement or Class C fly ash as a replacement of portland cement results in an increase in scaling compared to concrete with portland cement as the only binder. For mixtures with portland cement as the only binder and with a 20% slag cement replacement of portland cement, CaCl2 causes somewhat more scaling than NaCl. For mixtures with a replacement percentage using either SCM of 35%, NaCl causes more scaling than CaCl2. In all cases, however, the scaling mass losses for mixtures with 35% SCM replacements of portland cement were below the BNQ NQ 2621-900 failure limit. At 50% volume replacements, the increase in scaling is noticeably higher than with 20% and 35% replacements, especially for mixtures exposed to NaCl. The mass losses for mixtures with a 50% SCM replacement of portland cement exposed to NaCl exceeded the BNQ NQ 2621-900 failure limit. Extending the curing period from 14 to 28 days has no measurable effect on the scaling for most concrete mixtures in the study. Based on the findings of the wetting and drying and scaling tests, a partial replacement of portland cement with either slag cement or Class C fly ash is essential to produce durable concrete that will be subjected to the deicing salts CaCl2 or MgCl2 that cause the formation of calcium oxychloride. Using a 20% volume replacement of portland cement is not adequate, while a 35% volume replacement is. 
Replacement percentages above 35%, however, are not recommended when the deicing salt NaCl may be used because of increasingly poor scaling resistance with increasing slag cement and Class C fly ash replacement levels.
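The two durability criteria described above reduce to simple checks. The sketch below is an illustrative reading of those criteria, not code from the study; the function names and the example frequencies are ours. The relative dynamic modulus is assumed to follow the standard ASTM C666-style relation RDM = (n1/n0)² × 100, where n0 and n1 are the fundamental transverse frequencies before and after cycling.

```python
BNQ_LIMIT_LB_PER_FT2 = 0.10  # BNQ NQ 2621-900 scaling failure limit (lb/ft2)

def relative_dynamic_modulus(n0_hz: float, n1_hz: float) -> float:
    """Percent of the initial dynamic modulus retained after cycling."""
    return (n1_hz / n0_hz) ** 2 * 100.0

def passes_scaling(cumulative_mass_loss_lb_per_ft2: float) -> bool:
    """Durable if cumulative scaling mass loss stays below the BNQ limit."""
    return cumulative_mass_loss_lb_per_ft2 < BNQ_LIMIT_LB_PER_FT2

# Illustrative specimen whose frequency drops from 2000 Hz to 1900 Hz:
rdm = relative_dynamic_modulus(2000.0, 1900.0)  # 90.25% retained
```

A mixture would then be judged on the average RDM across its specimens together with the absence of spalling, as described above.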

    A Multiband Fractal Dipole Antenna for Wireless Communication Applications

    …communication applications. The proposed fractal antenna design is based on fractal geometry of the second-level tent function transformation. Because the resulting geometrical structures of a fractal tent function curve depend on the starting angles of the initial tent function, many dipole antennas have been modeled and their corresponding radiation characteristics evaluated. Theoretical performance of these antennas has been calculated using the method of moments (MoM) electromagnetic simulator IE3D. Simulation results for the many tent fractal dipole antennas modeled show that all of these antennas have multiband resonant behavior, but this behavior differs according to the starting angle of each antenna. The results have shown that these antennas have acceptable performance for VSWR ≤ 2 (return loss ≤ -10 dB), using a 50 Ω feed line, at most of the resonating frequencies. This feature provides the antenna designer with more degrees of freedom and makes the proposed antenna (or its monopole counterpart) suitable for use in modern multi-function communication systems.
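The VSWR ≤ 2 threshold quoted above is equivalent to the -10 dB return-loss criterion: a VSWR of 2 on a matched line gives a reflection-coefficient magnitude of (2 - 1)/(2 + 1) = 1/3, or about -9.5 dB, conventionally rounded to -10 dB. A minimal sketch of the conversion (the function name is ours):

```python
import math

def vswr_to_s11_db(vswr: float) -> float:
    """Return loss (S11 in dB, negative by convention) for a given VSWR."""
    gamma = (vswr - 1.0) / (vswr + 1.0)  # reflection-coefficient magnitude
    return 20.0 * math.log10(gamma)

s11 = vswr_to_s11_db(2.0)  # ≈ -9.54 dB, i.e. the usual -10 dB threshold
```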

    Assessment of Moisture-Tolerant Coatings for Decreasing Open Top Construction Time

    Open top construction is a practice commonly used in the construction of large structures, such as nuclear power plants, as it allows large equipment to be easily placed by lowering it into position from above. Doing so, however, requires the concrete floor to be finished and coated prior to placement. Current coating manufacturer recommendations state that concrete should be allowed to dry for a minimum of 28 days prior to coating application to avoid compromising the bond between the coating and the concrete or between coating layers that could result from excessive moisture. This requirement delays the construction process, adding significant costs. The ability to apply coatings without damage prior to 28 days would greatly reduce construction time and cost. Ten coating systems were evaluated in this study. The coatings were applied 7, 14, 21, 28, and 45 days after the end of wet curing. Coating adhesion was evaluated using the Standard Test Method for Pull-Off Strength of Coatings on Concrete Using Portable Adhesion Testers and the Standard Test Method for Evaluating Adhesion by Knife 7, 21, 28, and 56 days after application of the final top layer of the coating systems. Moisture vapor emission rate (MVER) and concrete relative humidity (RH) were monitored throughout the tests. Most but not all of the coatings investigated in this study may be applied to concrete as early as 7 days after completion of wet curing, at MVER values over 10 lb/1000 ft2/day (565 μg/m2/s) and internal relative humidity (RH) above 80%, without significant adverse effects on coating adhesion, offering the potential to speed open top construction of nuclear power plants. The thickness of concrete does not affect the value or rate of change in MVER or RH. Thicker coatings exhibit relatively poor performance in the knife test compared to thinner coatings. Coating systems should be evaluated to ensure that they can be successfully applied at early ages. 
Larger-scale prototype early-age applications should be performed and subjected to the full range of required testing for the appropriate Service Level prior to wide-scale application of these findings.
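The MVER threshold quoted above is a unit conversion away from its SI equivalent; the sketch below verifies the quoted figures (the constant and function names are ours):

```python
LB_TO_UG = 453.59237e6   # micrograms per pound
FT2_TO_M2 = 0.09290304   # square metres per square foot
DAY_TO_S = 86400.0       # seconds per day

def mver_to_si(mver_lb_per_1000ft2_day: float) -> float:
    """Convert a moisture vapor emission rate from lb/1000 ft2/day to ug/m2/s."""
    return mver_lb_per_1000ft2_day * LB_TO_UG / (1000.0 * FT2_TO_M2 * DAY_TO_S)

rate = mver_to_si(10.0)  # ≈ 565 ug/m2/s, matching the value quoted above
```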

    Analysis of Genetic Diversity in Some Rice Varieties and Their Performance in Bangladesh

    This study was conducted to evaluate the genetic diversity for several rice features and their association with yields, as well as to identify genotypes of short-duration rice. The experiment was conducted in the field in a natural environment, and data were collected on several plant parameters for each genotype at various phases of plant development. Twenty genotypes of rice were examined based on their morphological and physiological characteristics. From July through December of 2020, the experiment was conducted at the Bangladesh Rice Research Institute's regional station in Shyampur, Rajshahi. There was significant diversity among the twenty rice genotypes for all characteristics tested. The genotype BRRI dhan57 displayed the shortest days to flowering. In terms of days to maturity, the genotype BRRI dhan57 was the earliest, with a maturity time of 107.33 days, followed by BRRI dhan56 and BRRI dhan39. The days to blooming had the highest heritability (99.75%), followed by the days to maturity (99.58%), grain yield (85.30%), thousand grain weight (85.22%), grains per panicle (84.91%), plant height (82.21%), and tillers per hill (21.61%). High heritability scores indicated that the studied traits were less influenced by the surrounding environment. As a percentage of the mean, the genetic gain was greatest for grain yield (36.33%) and lowest for tillers per hill (6.60%) among the yield-contributing factors. High heritability and genetic advance were seen in days to flowering, days to maturity, grains per panicle, and plant height. According to the principal component analysis (PCA), the eigenvalues of the first four components accounted for 89.46% of the total variance, indicating that these components were mostly responsible for the genetic diversity of the current materials. Cluster I was the largest, containing seven rice genotypes. 
Clusters III and IV were the smallest, with only two genotypes apiece. The pattern of distribution of genotypes among the clusters demonstrated the significant genetic variety present in the genotypes, which may be the result of adaptation of these genotypes to certain environmental conditions. The largest intercluster distance indicated that cluster III genotypes were extremely distinct from cluster IV genotypes. Negative values in both vectors for tillers per hill suggested that this feature contributed the least to the total diversity. The number of panicles per hill, panicle length, thousand grain weight, and grain yield were all positive in both vectors, indicating that these four characteristics contributed the most to the total diversity. DOI: 10.47856/ijaast.2022.v09i12.00
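The heritability and genetic-gain figures reported above follow the standard plant-breeding formulas: broad-sense heritability H² = Vg/Vp × 100, and expected genetic advance GA = k·(Vg/Vp)·σp with k ≈ 2.06 at 5% selection intensity, expressed as a percentage of the trait mean. The sketch below illustrates the formulas with made-up variance figures, not the study's data:

```python
import math

def broad_sense_heritability(var_g: float, var_p: float) -> float:
    """H2 (%) = genotypic variance / phenotypic variance x 100."""
    return var_g / var_p * 100.0

def genetic_advance_pct_mean(var_g: float, var_p: float, trait_mean: float,
                             k: float = 2.06) -> float:
    """Expected genetic advance as % of mean (k = 2.06 at 5% selection)."""
    ga = k * (var_g / var_p) * math.sqrt(var_p)  # advance in trait units
    return ga / trait_mean * 100.0

# Illustrative numbers only:
h2 = broad_sense_heritability(80.0, 97.0)         # ≈ 82.5%
ga = genetic_advance_pct_mean(80.0, 97.0, 100.0)  # ≈ 16.7% of mean
```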

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
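The pooled prevalences and I² values above come from random-effects meta-analysis; a minimal DerSimonian-Laird sketch on made-up study counts (not the review's data) looks like this:

```python
def pooled_prevalence(events, totals):
    """Pooled prevalence (proportion) and I2 (%) via DerSimonian-Laird."""
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]  # within-study variance
    w = [1.0 / vi for vi in v]                           # fixed-effect weights
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    df = len(p) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1.0 / (vi + tau2) for vi in v]             # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(w_star, p)) / sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, i2

pooled, i2 = pooled_prevalence([30, 80, 10], [100, 200, 150])
```

In practice prevalences are usually transformed (e.g. logit or double-arcsine) before pooling; the untransformed version above just shows the weighting scheme.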

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). 
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
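The crude incidences above can be reproduced directly from the reported counts (the adjusted odds ratio additionally accounts for risk factors and cannot be derived from these totals):

```python
# Reported SSI events and patient totals per HDI group:
groups = {
    "high-HDI":   (691, 7339),
    "middle-HDI": (549, 3918),
    "low-HDI":    (298, 1282),
}

# 30-day SSI incidence (%) per group, rounded to one decimal place:
incidence = {g: round(e / n * 100, 1) for g, (e, n) in groups.items()}
# → {'high-HDI': 9.4, 'middle-HDI': 14.0, 'low-HDI': 23.2}
```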

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
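The checklist-use contrast above can be sanity-checked with a crude (unadjusted) odds ratio from the raw counts; the paper reports adjusted values, so the figures differ in general (the function is ours):

```python
def odds_ratio(a_events: int, a_total: int, b_events: int, b_total: int) -> float:
    """Odds of the event in group A relative to group B (unadjusted)."""
    odds_a = a_events / (a_total - a_events)
    odds_b = b_events / (b_total - b_events)
    return odds_a / odds_b

# Middle-HDI checklist use (753 of 1242) vs high-HDI (2455 of 2741):
crude_or = odds_ratio(753, 1242, 2455, 2741)  # ≈ 0.18, near the adjusted 0.17
```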

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

    Background Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) compared with in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association between hospital infrastructure, resource availability, and processes on early outcomes after cancer surgery worldwide. Methods A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection to select those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country-income group, with population stratification by hospital. Findings Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case-mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality compared with those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.