41 research outputs found

    Analysis of RC Continuous Beams Strengthened with FRP Plates: A Finite Element Model

    The technique of strengthening reinforced concrete (RC) beams with externally bonded fibre-reinforced polymer (FRP) plates/sheets has become widespread in the last two decades. Although a great deal of research has been conducted on simply supported RC beams, few studies have been carried out on continuous beams strengthened with FRP composites. This paper presents a simple uniaxial nonlinear finite-element model (UNFEM) that is able to accurately estimate the load-carrying capacity and the behaviour of RC continuous beams flexurally strengthened with externally bonded FRP plates at both the upper and lower fibres. A 21-degree-of-freedom element is proposed with layer discretization of the cross-sections for finite-element (FE) modelling. Realistic nonlinear constitutive relations are employed to describe the stress-strain behaviour of each component of the strengthened beam. The FE model is based on nonlinear fracture mechanics. The interfacial shear and normal stresses in the adhesive layer are represented using an analytical uncoupled cohesive zone model with a mixed-mode fracture criterion. The results of the proposed FE model are verified by comparison with various selected experimental measurements available in the literature. The numerical results for the plated beams (beams strengthened with FRP plates) agreed very well with the experimental results. The use of FRP increased the ultimate load capacity by up to 100% compared with the non-strengthened beams, as observed in series (S). The major objective of the current model is to help engineers model FRP-strengthened RC continuous beams in a simple manner.
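The layer-discretization idea described in the abstract can be illustrated with a minimal fibre-section sketch. This is not the paper's UNFEM: the section dimensions, reinforcement area, and material laws below are hypothetical, the neutral-axis depth is taken as given rather than solved for, and the FRP plate and cohesive interface are omitted.

```python
# Minimal layer-discretized ("fibre") section analysis, illustrating the idea
# of assigning each thin layer a uniaxial stress from its strain under the
# plane-sections assumption. All numbers are hypothetical.

def concrete_stress(eps, fc=30.0, eps0=0.002):
    """Parabolic stress-strain law for concrete in compression (MPa).
    Compressive strain is positive; tensile strength is neglected."""
    if eps <= 0.0:
        return 0.0
    r = eps / eps0
    return fc * (2.0 * r - r * r) if r < 1.0 else fc

def steel_stress(eps, fy=500.0, Es=200_000.0):
    """Elastic-perfectly-plastic law for reinforcing steel (MPa)."""
    return max(-fy, min(fy, Es * eps))

def section_moment(curvature, na_depth, b=300.0, h=500.0,
                   n_layers=100, As=1000.0, d=450.0):
    """Internal moment (N*mm) about the neutral axis for a given curvature
    (1/mm) and neutral-axis depth from the top fibre (mm). The concrete
    section (width b, height h) is split into n_layers horizontal layers;
    tensile steel of area As sits at depth d."""
    dz = h / n_layers
    M = 0.0
    for i in range(n_layers):
        z = (i + 0.5) * dz                # layer centroid depth from top
        eps = curvature * (na_depth - z)  # compression positive above the NA
        M += concrete_stress(eps) * b * dz * (na_depth - z)
    eps_s = curvature * (na_depth - d)    # strain at the tensile steel
    M += steel_stress(eps_s) * As * (na_depth - d)
    return M

print(section_moment(1e-5, 200.0))  # moment in N*mm for this trial state
```

In a full model the neutral-axis depth would be iterated until the section's axial force balances; here it is fixed only to keep the sketch short.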

    An Experimental Investigation on the Effect of Calcium Chloride As Dust Suppressant on the Strength of Unpaved Road

    The quality of the gravel used in road construction has a profound positive impact on road service life. The potential use of calcium chloride as a dust-control agent and as a material for stabilizing the base of unpaved roads has been researched. However, the quality of the gravel may be affected if calcium chloride is introduced into it as a dust suppressant. The main aim of this research is to examine the increase in strength of soil specimens treated with calcium chloride and to evaluate how different proportions of calcium chloride, used as a dust suppressant and base stabilizer, affect the California Bearing Ratio (CBR) value of the soil samples. Mixtures of natural gravel and gravel with varying amounts of calcium chloride were analyzed for their Atterberg limits, grading, maximum dry density, CBR properties, and optimum moisture content. The changes in the characteristics of the gravel-calcium chloride mixtures were analyzed. It was found that the particle size distribution and Atterberg limits remained largely unchanged. However, the optimum moisture content (OMC) decreased from 9.2% to 7.6%, 7.4%, and 7.2% with calcium chloride added at 2%, 3%, and 4% per volume of dry soil, respectively. Observation of the mixtures revealed an increase in the maximum dry density (MDD) as the ratio of calcium chloride was increased: the MDD rose significantly from 2.15 Mg/m3 to 2.31 Mg/m3, 2.35 Mg/m3, and 2.36 Mg/m3, respectively. Along with this, the CBR at 95% compaction improved from 25% to 29%, 32%, and 36% with increasing ratios of calcium chloride. The increase in dry density can be explained by the improved bonding between particles and the reduction of air voids. This increase in dry density, in turn, positively influences the CBR by transforming the soil structure from a dispersed state to a flocculated state. It can be inferred from the results that calcium chloride has the potential to function as a stabilizer for unpaved roads. The findings of this study are expected to reduce life-cycle costs for unpaved roads, provide insights into the best approach to materials analysis for unpaved roads, and contribute to environmental benefits by minimizing dust emissions into the atmosphere and reducing the release of chemicals into nature.

    Radial neck fracture in children: anatomic and functional results of Metaizeau technique

    Fractures of the radial neck account for 1% of all childhood fractures and for 5% to 10% of childhood traumatic lesions involving the elbow. Intramedullary percutaneous nail reduction (the Metaizeau technique) is considered the most effective surgical technique. The purpose of this study was to identify the main clinical features of radial neck fracture in children and to evaluate the anatomical and functional results of the Metaizeau technique. In this retrospective study, we evaluated 22 patients under the age of 16 who were treated for radial neck fracture at the orthopedic and trauma surgery department of Sahloul University Hospital in Sousse over a period of 16 years, from January 2001 to April 2017. The authors used the Metaizeau classification. Functional results were evaluated by the Mayo elbow performance score (MEPS), and the radiological evaluation was based on standard images with measurement of the residual angulation. The average age was 8.6 years (range 5-13 years). Seven fractures were grade III injuries and three were grade IV. In the immediate postoperative period, radiological measurements showed a residual angulation of less than 20° in 86.3% of cases and of more than 20° in 13.7%. At an average follow-up of 13.5 months, the MEPS was excellent or good for 17 patients. Four types of complications were found: necrosis of the radial head in 1 case, pseudarthrosis in 1 case, periarticular calcification in 2 cases, and stiffness of the elbow in 3 cases. Despite the small number of patients in our series, we believe that elastic stable intramedullary pinning according to the Metaizeau technique is the treatment of choice for displaced radial neck fractures in children.

    Limited Phosphorous Supply Improved Lipid Content of Chlorella vulgaris That Increased Phenol and 2-Chlorophenol Adsorption from Contaminated Water with Acid Treatment

    Phenolic compounds are toxic and widely present in industrial effluents, which can end up in water bodies, causing potential damage to living organisms. This study employed the dried biomass of the freshwater green microalga Chlorella vulgaris to remove phenol and 2-chlorophenol from an aqueous environment. C. vulgaris was grown under different phosphorus- (P) starved conditions, and the biomass was treated with sulfuric acid. It was observed that reducing the P level enhanced the lipid content by 7.8 times while decreasing protein by 7.2 times. P-starved C. vulgaris dried biomass removed 69% of phenol and 57% of 2-chlorophenol from the contaminated water after 180 min. Acid-treated P-starved C. vulgaris dried biomass removed 77% of phenol and 75% of 2-chlorophenol after 180 min. Thus, economical and eco-friendly P-starved and acid-treated C. vulgaris biomass has better potential to remove phenol and 2-chlorophenol from contaminated groundwater and industrial wastewater. This research has been funded by the Scientific Research Deanship at the University of Ha’il, Saudi Arabia, through project number RG-21 105.
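The removal percentages quoted above follow the removal-efficiency definition standard in adsorption studies; a minimal sketch (the concentrations used in the example are hypothetical illustration values, not measurements from the study):

```python
# Removal efficiency as commonly defined in adsorption studies:
#   removal (%) = (C0 - Ct) / C0 * 100
# where C0 is the initial and Ct the residual pollutant concentration.

def removal_percent(c0, ct):
    """Percentage of pollutant removed from solution."""
    if c0 <= 0:
        raise ValueError("initial concentration must be positive")
    return 100.0 * (c0 - ct) / c0

# e.g. a hypothetical 50 mg/L phenol solution reduced to 11.5 mg/L
# after 180 min of contact with the sorbent
print(removal_percent(50.0, 11.5))  # 77.0
```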

    Global, regional, and national incidence and mortality for HIV, tuberculosis, and malaria during 1990–2013: a systematic analysis for the Global Burden of Disease Study 2013

    BACKGROUND: The Millennium Declaration in 2000 brought special global attention to HIV, tuberculosis, and malaria through the formulation of Millennium Development Goal (MDG) 6. The Global Burden of Disease 2013 study provides a consistent and comprehensive approach to disease estimation from 1990 to 2013, and an opportunity to assess whether accelerated progress has occurred since the Millennium Declaration. METHODS: To estimate incidence and mortality for HIV, we used the UNAIDS Spectrum model appropriately modified based on a systematic review of available studies of mortality with and without antiretroviral therapy (ART). For concentrated epidemics, we calibrated Spectrum models to fit vital registration data corrected for misclassification of HIV deaths. In generalised epidemics, we minimised a loss function to select epidemic curves most consistent with prevalence data and demographic data for all-cause mortality. We analysed counterfactual scenarios for HIV to assess years of life saved through prevention of mother-to-child transmission (PMTCT) and ART. For tuberculosis, we analysed vital registration and verbal autopsy data to estimate mortality using cause of death ensemble modelling. We analysed data for corrected case-notifications, expert opinions on the case-detection rate, prevalence surveys, and estimated cause-specific mortality using Bayesian meta-regression to generate consistent trends in all parameters. We analysed malaria mortality and incidence using an updated cause of death database, a systematic analysis of verbal autopsy validation studies for malaria, and recent studies (2010-13) of incidence, drug resistance, and coverage of insecticide-treated bednets. FINDINGS: Globally in 2013, there were 1·8 million new HIV infections (95% uncertainty interval 1·7 million to 2·1 million), 29·2 million prevalent HIV cases (28·1 to 31·7), and 1·3 million HIV deaths (1·3 to 1·5).
At the peak of the epidemic in 2005, HIV caused 1·7 million deaths (1·6 million to 1·9 million). Concentrated epidemics in Latin America and eastern Europe are substantially smaller than previously estimated. Through interventions including PMTCT and ART, 19·1 million life-years (16·6 million to 21·5 million) have been saved, 70·3% (65·4 to 76·1) in developing countries. From 2000 to 2011, the ratio of development assistance for health for HIV to years of life saved through intervention was US$4498 in developing countries. Including HIV-positive individuals, all-form tuberculosis incidence was 7·5 million (7·4 million to 7·7 million), prevalence was 11·9 million (11·6 million to 12·2 million), and number of deaths was 1·4 million (1·3 million to 1·5 million) in 2013. In the same year, in HIV-negative individuals only, all-form tuberculosis incidence was 7·1 million (6·9 million to 7·3 million), prevalence was 11·2 million (10·8 million to 11·6 million), and number of deaths was 1·3 million (1·2 million to 1·4 million). Annualised rates of change (ARC) for incidence, prevalence, and death became negative after 2000. Tuberculosis in HIV-negative individuals disproportionately occurs in men and boys (versus women and girls): 64·0% of cases (63·6 to 64·3) and 64·7% of deaths (60·8 to 70·3). Globally, malaria cases and deaths grew rapidly from 1990, reaching a peak of 232 million cases (143 million to 387 million) in 2003 and 1·2 million deaths (1·1 million to 1·4 million) in 2004. Since 2004, child deaths from malaria in sub-Saharan Africa have decreased by 31·5% (15·7 to 44·1). Outside of Africa, malaria mortality has been steadily decreasing since 1990. INTERPRETATION: Our estimates of the number of people living with HIV are 18·7% smaller than UNAIDS's estimates in 2012. The number of people living with malaria is larger than estimated by WHO. The numbers of people living with HIV, tuberculosis, or malaria have all decreased since 2000.
At the global level, upward trends for malaria and HIV deaths have been reversed and declines in tuberculosis deaths have accelerated. 101 countries (74 of which are developing) still have increasing HIV incidence. Substantial progress since the Millennium Declaration is an encouraging sign of the effect of global action. FUNDING: Bill & Melinda Gates Foundation.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Breast cancer management pathways during the COVID-19 pandemic: outcomes from the UK ‘Alert Level 4’ phase of the B-MaP-C study

    Background: The B-MaP-C study aimed to determine alterations to breast cancer (BC) management during the peak transmission period of the UK COVID-19 pandemic and the potential impact of these treatment decisions. Methods: This was a national cohort study of patients with early BC undergoing multidisciplinary team (MDT)-guided treatment recommendations during the pandemic, designated ‘standard’ or ‘COVID-altered’, in the preoperative, operative and post-operative setting. Findings: Of 3776 patients (from 64 UK units) in the study, 2246 (59%) had ‘COVID-altered’ management. ‘Bridging’ endocrine therapy was used (n = 951) where theatre capacity was reduced. There was increasing access to COVID-19 low-risk theatres during the study period (59%). In line with national guidance, immediate breast reconstruction was avoided (n = 299). Where adjuvant chemotherapy was omitted (n = 81), the median benefit was only 3% (IQR 2–9%) using ‘NHS Predict’. There was rapid adoption of new evidence-based hypofractionated radiotherapy (n = 781, from 46 units). Only 14 patients (1%) tested positive for SARS-CoV-2 during their treatment journey. Conclusions: Most ‘COVID-altered’ management decisions were in line with pre-COVID evidence-based guidelines, implying that breast cancer survival outcomes are unlikely to be negatively impacted by the pandemic. However, in this study, the potential impact of delays to BC presentation or diagnosis remains unknown.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low-middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable by more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low-middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low-middle-income countries.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.