
    Diagnostic value of CD14+ CD16+ monocytes in neonatal sepsis

    Background: The majority of monocytes (MO) are strongly positive for CD14 and negative for CD16. The phenotype and function of peripheral blood monocytes change after trauma and during sepsis. CD14+CD16+ monocytes have been identified as a minor monocyte subpopulation with potent phagocytosing and antigen-presenting capacity that expands during acute and chronic infections. Objective: To evaluate monocyte expression of CD14 and CD16 in preterm neonates and to assess it as a possible marker for early diagnosis of neonatal sepsis, as the early clinical signs are often insidious and non-specific. Methods: This study was carried out on 45 preterm neonates (1-3 days old) with a mean gestational age of 34.5 ± 1.03 weeks. They were classified into three groups: Group I included 15 neonates with proven sepsis, Group II included 15 neonates with possible or suspected infection, and Group III (control group) included 15 healthy age- and sex-matched neonates. The neonates with possible infection were followed up; nine of them later developed sepsis (proven clinically and by laboratory findings) and were considered patients with early sepsis at the time of admission. History taking and clinical examination were performed, as well as laboratory investigations including a complete blood count, blood culture and sensitivity (for patients only), measurement of C-reactive protein (CRP), and CD14 and CD16 expression on monocytes by flow cytometry. Results: The proportion of CD14+CD16+ MO within all circulating monocytes was significantly higher in patients with proven (75.2±13.1%), early (63.9±17.9%) or possible sepsis (55.1±26.8%) than in controls (3.86±2.53%) (p < 0.0001, p < 0.0001, p < 0.001, respectively). It was higher in neonates with proven than with possible sepsis (p < 0.05), whereas it was comparable in the proven and early sepsis groups (p > 0.05). There was a significant positive correlation between the mean fluorescence intensity (MFI) of CD16+ MO and CRP (p < 0.01) and a significant negative correlation between it and the platelet count (p < 0.05) among patients. When neonates with early sepsis were followed up after 48 hours, a significant increase in CRP levels and in the MFI of CD16 expression on monocytes was noted (p < 0.01 for both). The sensitivity and negative predictive value of CD14+CD16+ MO% and of the MFI of CD16+ MO were higher than those of CRP. The specificity and positive predictive value of CD14+CD16+ MO% were similar to those of CRP. The cut-off point (obtained from the ROC curve) for CD14+CD16+ MO% was 8.6% and that for the MFI of CD16+ MO was 9. Conclusion: Measurement of the percentage of CD14+CD16+ MO among circulating MO is a promising, rapid and sensitive test for the early diagnosis of neonatal sepsis and for the exclusion of infection in neonates at high risk of developing sepsis. NICU costs as well as unnecessary antibiotic use can thus be reduced.
    Keywords: CD14, CD16, monocyte, neonate, sepsis
    Egypt J Pediatr Allergy Immunol 2004; 2(1): 16-2
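
    The abstract reports a ROC-derived cut-off and sensitivity/specificity/PPV/NPV for the marker but does not state how the cut-off was chosen. Below is a minimal sketch of one common approach (Youden's index), using invented per-neonate marker values and labels rather than the study's data.

```python
# Minimal sketch: deriving a ROC cut-off (Youden's index) and diagnostic metrics
# for a continuous marker such as CD14+CD16+ MO%. All values are hypothetical
# placeholders, not the study's data; the paper's actual criterion is not stated.
import numpy as np

rng = np.random.default_rng(0)
labels = np.array([1] * 24 + [0] * 15)                 # 1 = sepsis, 0 = control
marker = np.concatenate([rng.normal(65, 18, 24),       # septic neonates (assumed)
                         rng.normal(4, 2.5, 15)])      # healthy controls (assumed)

def metrics_at(threshold, marker, labels):
    pred = marker >= threshold
    tp = np.sum(pred & (labels == 1))
    fp = np.sum(pred & (labels == 0))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    npv = tn / (tn + fn) if tn + fn else float("nan")
    return sens, spec, ppv, npv

# Scan candidate thresholds and keep the one maximising Youden's J = sens + spec - 1
candidates = np.unique(marker)
j_values = [sum(metrics_at(t, marker, labels)[:2]) - 1 for t in candidates]
best = candidates[int(np.argmax(j_values))]
sens, spec, ppv, npv = metrics_at(best, marker, labels)
print(f"cut-off={best:.1f}%  sensitivity={sens:.2f}  specificity={spec:.2f}  "
      f"PPV={ppv:.2f}  NPV={npv:.2f}")
```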

    Geochemical characterization of recent Nile Delta inner shelf sediments: Tracing natural and human-induced alterations into a deltaic system

    The present study deals with the geochemical changes observed in Nile Delta inner shelf sediments over a period of 20 years (1995–2015). Major, minor, and trace constituents as well as rare earth elements (REE) were investigated in surface sediments collected from seven transects along the inner shelf at five-year intervals. The geochemical composition of the Nile Delta inner shelf sediments changes continuously over time owing to depositional and sediment-transport processes. The sediments are generally enriched in Fe and Ti oxides, as well as Ta, Nb and Y, in comparison with the Upper Continental Crust (UCC). These alterations reflect the impact of processes such as erosion and sediment transport, as well as anthropogenic interferences such as the damming of the Nile River flow. The reduction of sediment input from the Nile River has altered the geochemical signature of the inner shelf sediments. The REE patterns indicate weathering in areas subjected to erosion, while the spatial and temporal distributions of trace elements and major oxides concentrate eastwards under the influence of the easterly sediment-transport pattern. The Nile Delta inner shelf presents a typical case for understanding the link between geochemistry and sedimentary processes in nearshore and deltaic systems.
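
    Enrichment relative to the UCC is often expressed as an Al-normalised enrichment factor. The sketch below illustrates that calculation under assumptions: the sample concentrations are hypothetical, the normaliser (Al) is assumed, and the UCC values are rounded literature figures that may differ from those used in the study.

```python
# Minimal sketch: Al-normalised enrichment factor (EF) relative to the
# Upper Continental Crust (UCC). Sample concentrations are hypothetical;
# UCC values are rounded literature figures (ppm), not the study's reference set.
UCC_PPM = {"Al": 81500, "Fe": 39200, "Ti": 3800, "Nb": 12, "Y": 21, "Ta": 0.9}

def enrichment_factor(sample_ppm, element, normaliser="Al"):
    """EF = (X/Al)_sample / (X/Al)_UCC; EF well above 1 suggests enrichment."""
    sample_ratio = sample_ppm[element] / sample_ppm[normaliser]
    ucc_ratio = UCC_PPM[element] / UCC_PPM[normaliser]
    return sample_ratio / ucc_ratio

# Hypothetical inner-shelf sediment sample (ppm)
sample = {"Al": 72000, "Fe": 61000, "Ti": 7400, "Nb": 22, "Y": 38, "Ta": 1.8}
for element in ("Fe", "Ti", "Nb", "Y", "Ta"):
    print(f"EF({element}) = {enrichment_factor(sample, element):.2f}")
```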

    Application of Stabilized Silver Nanoparticles as Thin Films as Corrosion Inhibitors for Carbon Steel Alloy in 1 M Hydrochloric Acid

    Nanometer-scale materials have attracted tremendous interest as corrosion-protective films owing to their strong ability to form self-assembled films on metal surfaces. It is well known that silver nanoparticles are highly reactive towards aqueous acidic solutions. The present work aims to prepare coated silver nanoparticles to protect carbon steel alloys from aqueous acidic corrosive media. In this respect, colloidal solutions of Ag nanoparticles were produced by reducing AgNO3 with trisodium citrate in aqueous solution, either alone or in the presence of a stabilizer such as poly(ethylene glycol) thiol or poly(vinyl pyrrolidone). The morphology of the modified silver nanoparticles was investigated by TEM and DLS, and UV-Vis absorption spectra were used to study the effect of HCl on the stability of the dispersed silver nanoparticles. The corrosion inhibition efficiency of the self-assembled monolayers of poly(ethylene glycol) thiol-stabilized Ag nanoparticles was determined by the polarization method and by electrochemical impedance spectroscopy (EIS). Polarization curves indicated that the poly(ethylene glycol) thiol-coated silver nanoparticles acted as a mixed-type inhibitor. The inhibition efficiencies obtained from polarization measurements are in good agreement with those obtained from electrochemical impedance spectroscopy.
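
    Inhibition efficiency from polarization and EIS data is commonly computed from corrosion current densities and charge-transfer resistances, respectively. The sketch below shows those standard definitions; the input values are hypothetical placeholders, not the paper's measurements.

```python
# Minimal sketch: corrosion inhibition efficiency from polarization and EIS data.
# The numerical inputs are hypothetical placeholders, not the paper's measurements.

def ie_from_polarization(i_corr_blank, i_corr_inhibited):
    """IE% = (1 - i_corr,inh / i_corr,blank) * 100, from corrosion current densities."""
    return (1.0 - i_corr_inhibited / i_corr_blank) * 100.0

def ie_from_eis(rct_blank, rct_inhibited):
    """IE% = (1 - Rct,blank / Rct,inh) * 100, from charge-transfer resistances."""
    return (1.0 - rct_blank / rct_inhibited) * 100.0

# Hypothetical measurements in 1 M HCl
print(f"Polarization IE: {ie_from_polarization(850e-6, 120e-6):.1f}%")  # A/cm^2
print(f"EIS IE:          {ie_from_eis(35.0, 240.0):.1f}%")              # ohm*cm^2
```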

    Gender Differences in Presentation, Management, and In-Hospital Outcomes for Patients with AMI in a Lower-Middle Income Country: Evidence from Egypt

    BACKGROUND: Many studies in high-income countries have investigated gender differences in the care and outcomes of patients hospitalized with acute myocardial infarction (AMI). However, little evidence exists on gender differences among patients with AMI in lower-middle-income countries, where the proportion of deaths stemming from cardiovascular disease is projected to increase dramatically. This study examines gender differences in patients in the lower-middle-income country of Egypt to determine whether female patients with AMI have a different presentation, management, or outcome compared with men. METHODS AND FINDINGS: Using registry data collected over 18 months from 5 Egyptian hospitals, we considered 1204 patients (253 females, 951 males) with a confirmed diagnosis of AMI. We examined gender differences in initial presentation, clinical management, and in-hospital outcomes using t-tests and χ² tests. Additionally, we explored gender differences in in-hospital death using multivariate logistic regression to adjust for age and other differences in initial presentation. We found that women were older than men, had a higher BMI, and were more likely to have hypertension, diabetes mellitus, dyslipidemia, heart failure, and atrial fibrillation. Women were less likely to receive aspirin upon admission (p<0.01) or aspirin or statins at discharge (p = 0.001 and p<0.05, respectively), although the magnitude of these differences was small. While unadjusted in-hospital mortality was significantly higher for women (OR: 2.10; 95% CI: 1.54 to 2.87), this difference did not persist in the fully adjusted model (OR: 1.18; 95% CI: 0.55 to 2.55). CONCLUSIONS: We found that female patients had a different profile from men at the time of presentation. Clinical management of men and women with AMI was similar, though there were small but significant differences in some areas. These gender differences did not translate into differences in in-hospital outcome, but they highlight differences in quality of care and represent important opportunities for improvement.
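
    A small sketch of the kind of analysis described (unadjusted versus covariate-adjusted odds ratio for in-hospital death by gender, via logistic regression) is given below. The data are simulated placeholders and the covariates are assumptions; the registry's actual variables and model specification are not reproduced.

```python
# Minimal sketch: unadjusted vs adjusted odds ratio for in-hospital death by
# gender using logistic regression, in the spirit of the analysis described.
# All data are simulated placeholders, not the Egyptian registry data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1204
female = rng.binomial(1, 253 / 1204, n)              # roughly the reported gender mix
age = rng.normal(58 + 6 * female, 11, n)             # women older on average (assumed)
diabetes = rng.binomial(1, 0.25 + 0.10 * female, n)  # higher comorbidity in women (assumed)
logit = -3.5 + 0.04 * (age - 58) + 0.5 * diabetes + 0.1 * female
died = rng.binomial(1, 1 / (1 + np.exp(-logit)), n)

df = pd.DataFrame({"died": died, "female": female, "age": age, "diabetes": diabetes})

unadj = sm.Logit(df["died"], sm.add_constant(df[["female"]])).fit(disp=0)
adj = sm.Logit(df["died"], sm.add_constant(df[["female", "age", "diabetes"]])).fit(disp=0)
print("Unadjusted OR (female):", np.exp(unadj.params["female"]).round(2))
print("Adjusted OR (female):  ", np.exp(adj.params["female"]).round(2))
```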

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I²=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I²=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I²=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I²=98%) than in other migrant groups (6·6%, 1·8-11·3; I²=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I²=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I²=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration.
FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
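
    The review pools study prevalences with random-effects models and reports I² heterogeneity. A minimal sketch of one standard approach (DerSimonian-Laird pooling of raw proportions) is shown below; the per-study counts are invented placeholders, and the review's exact method (e.g. any transformation of proportions) is not stated in the abstract.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of prevalence
# estimates with an I-squared heterogeneity statistic. The per-study counts
# are invented placeholders, not data from the included studies.
import numpy as np

studies = [(12, 60), (40, 110), (9, 75), (55, 140), (20, 160)]  # (events, n), hypothetical

p = np.array([e / n for e, n in studies])                              # study prevalences
v = np.array([pi * (1 - pi) / n for pi, (e, n) in zip(p, studies)])    # within-study variances
w_fixed = 1 / v

# Cochran's Q and between-study variance tau^2 (DerSimonian-Laird)
p_fixed = np.sum(w_fixed * p) / np.sum(w_fixed)
q = np.sum(w_fixed * (p - p_fixed) ** 2)
df = len(studies) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

w_random = 1 / (v + tau2)
p_pooled = np.sum(w_random * p) / np.sum(w_random)
se = np.sqrt(1 / np.sum(w_random))
lo, hi = p_pooled - 1.96 * se, p_pooled + 1.96 * se
print(f"Pooled prevalence: {p_pooled:.1%} (95% CI {lo:.1%}-{hi:.1%}), I^2 = {i2:.0f}%")
```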

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

    Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association of hospital infrastructure, resource availability, and processes with early outcomes after cancer surgery worldwide. Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection to select those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country-income group, with population stratification by hospital. Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case-mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality compared with those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.
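
    The adjusted analysis described uses generalised estimating equations with hospitals as clusters. The sketch below illustrates that modelling pattern on simulated placeholder data; the facility indicator, covariates, and correlation structure are assumptions, not the GlobalSurg 3 specification.

```python
# Minimal sketch: cluster-adjusted odds ratio for 30-day mortality by facility
# level using a GEE with hospitals as clusters, in the spirit of the analysis
# described. Data are simulated placeholders, not the GlobalSurg 3 dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_hosp, per_hosp = 60, 40
hospital = np.repeat(np.arange(n_hosp), per_hosp)
low_facility = np.repeat(rng.binomial(1, 0.3, n_hosp), per_hosp)  # <=3 of 5 facilities (assumed)
age = rng.normal(62, 12, n_hosp * per_hosp)
logit = -4.0 + 1.2 * low_facility + 0.03 * (age - 62)
died30 = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"died30": died30, "low_facility": low_facility,
                   "age": age, "hospital": hospital})
model = sm.GEE(df["died30"], sm.add_constant(df[["low_facility", "age"]]),
               groups=df["hospital"], family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
res = model.fit()
print("Adjusted OR (<=3 facilities):", np.exp(res.params["low_facility"]).round(2))
```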

    Global prevalence and genotype distribution of hepatitis C virus infection in 2015 : A modelling study

    Background: The 69th World Health Assembly approved the Global Health Sector Strategy to eliminate hepatitis C virus (HCV) infection by 2030, which can become a reality with the recent launch of direct-acting antiviral therapies. Reliable disease burden estimates are required for national strategies. This analysis estimates the global prevalence of viraemic HCV at the end of 2015, an update of, and expansion on, the 2014 analysis, which reported 80 million (95% CI 64–103) viraemic infections in 2013. Methods: We developed country-level disease burden models following a systematic review of HCV prevalence (number of studies, n=6754) and genotype (n=11 342) studies published after 2013. A Delphi process was used to gain country expert consensus and validate inputs. Published estimates alone were used for countries where expert panel meetings could not be scheduled. Global prevalence was estimated using regional averages for countries without data. Findings: Models were built for 100 countries, 59 of which were approved by country experts, with the remaining 41 estimated using published data alone. The remaining countries had insufficient data to create a model. The global prevalence of viraemic HCV is estimated to be 1·0% (95% uncertainty interval 0·8–1·1) in 2015, corresponding to 71·1 million (62·5–79·4) viraemic infections. Genotypes 1 and 3 were the most common cause of infections (44% and 25%, respectively). Interpretation: The global estimate of viraemic infections is lower than previous estimates, largely due to more recent (lower) prevalence estimates in Africa. Additionally, increased mortality due to liver-related causes and an ageing population may have contributed to a reduction in infections. Funding: John C Martin Foundation.
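
    The abstract states that global prevalence was estimated using regional averages for countries without data. The sketch below shows one simple way such gap-filling and population-weighted aggregation can work; all country figures are invented placeholders and the actual modelling framework is far more detailed.

```python
# Minimal sketch: filling countries without data with their regional average
# prevalence and taking a population-weighted global total. All figures are
# invented placeholders, not the study's country-level models.
# (country, region, population_millions, viraemic_prevalence or None)
countries = [
    ("A", "Region1", 95.0, 0.045),
    ("B", "Region1", 40.0, None),    # no data -> use regional average
    ("C", "Region2", 210.0, 0.007),
    ("D", "Region2", 60.0, 0.012),
    ("E", "Region2", 30.0, None),    # no data -> use regional average
]

# Population-weighted regional averages from countries with data
regions = {}
for _, region, pop, prev in countries:
    if prev is not None:
        infected, total = regions.get(region, (0.0, 0.0))
        regions[region] = (infected + pop * prev, total + pop)
regional_avg = {r: inf / tot for r, (inf, tot) in regions.items()}

# Fill gaps and aggregate to a global estimate
total_pop = sum(pop for _, _, pop, _ in countries)
total_infected = sum(pop * (prev if prev is not None else regional_avg[region])
                     for _, region, pop, prev in countries)
print(f"Global prevalence ~ {total_infected / total_pop:.2%} "
      f"({total_infected:.1f} million viraemic infections)")
```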

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
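
    The paper's adjusted estimate comes from Bayesian multilevel logistic regression and cannot be reproduced from the abstract alone, but the unadjusted contrast can be illustrated from the counts reported above (298 of 1282 SSIs in low-HDI vs 691 of 7339 in high-HDI countries). A minimal sketch with a Wald 95% CI follows.

```python
# Minimal sketch: unadjusted odds ratio of SSI for low-HDI vs high-HDI countries
# from the counts reported in the abstract (298/1282 vs 691/7339), with a Wald
# 95% CI. This does not reproduce the paper's Bayesian multilevel adjusted
# estimate (adjusted OR 1.60).
import math

def odds_ratio_ci(a, n1, b, n2, z=1.96):
    """OR and Wald CI for a events of n1 (exposed) vs b events of n2 (reference)."""
    c, d = n1 - a, n2 - b
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(298, 1282, 691, 7339)
print(f"Unadjusted OR (low vs high HDI): {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```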

    A new online scheduling approach for enhancing QOS in cloud

    Quality of Service (QoS) is one of the most important requirements of cloud users, so cloud providers continuously try to enhance cloud management tools to guarantee the required QoS and provide users with high-quality services. One of the most important management tools that plays a vital role in enhancing QoS is scheduling, the process of assigning users' tasks to available Virtual Machines (VMs). This paper presents a new task scheduling approach, called Online Potential Finish Time (OPFT), to enhance the cloud data-center broker, which is responsible for the scheduling process, and thereby address the QoS issue. The main idea of the new approach is inspired by vehicles passing along a highway: as the width of the road increases, the number of vehicles that can pass increases. We apply this idea to the assignment of users' tasks to the available VMs. The number of tasks allocated to a VM is proportional to the processing power of that VM: as the VM's capacity increases, so does the number of tasks assigned to it. The proposed OPFT approach is evaluated using the CloudSim simulator with real tasks and a real cost model. The experimental results indicate that the proposed OPFT algorithm is more efficient than the FCFS, RR, Min-Min, and MCT algorithms in terms of schedule length, cost, balance degree, response time, and resource utilization.
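
    The capacity-proportional idea described above can be illustrated with a small online scheduler that sends each arriving task to the VM offering the earliest potential finish time, so faster VMs naturally absorb proportionally more work. This is a sketch of the principle only, not the authors' published OPFT algorithm; the task lengths and VM capacities are invented.

```python
# Minimal sketch of capacity-proportional online task allocation: each arriving
# task is assigned to the VM whose current queue would let it finish earliest,
# given that VM's processing power. Illustrative only; not the published OPFT
# algorithm. Task lengths (MI) and VM capacities (MIPS) are invented.
from dataclasses import dataclass, field

@dataclass
class VM:
    name: str
    mips: float                      # processing capacity
    queued_mi: float = 0.0           # total work (million instructions) assigned so far
    tasks: list = field(default_factory=list)

    def potential_finish_time(self, task_mi: float) -> float:
        """Time at which this task would finish if assigned to this VM now."""
        return (self.queued_mi + task_mi) / self.mips

def schedule(task_mi: float, vms: list[VM]) -> VM:
    """Assign an arriving task to the VM with the earliest potential finish time."""
    best = min(vms, key=lambda vm: vm.potential_finish_time(task_mi))
    best.queued_mi += task_mi
    best.tasks.append(task_mi)
    return best

vms = [VM("vm-small", 500), VM("vm-medium", 1000), VM("vm-large", 2000)]
for task in [4000, 1500, 2500, 800, 6000, 1200, 3000, 900]:   # arriving task lengths (MI)
    schedule(task, vms)

for vm in vms:
    print(f"{vm.name}: {len(vm.tasks)} tasks, queued work finishes ~ {vm.queued_mi / vm.mips:.2f} s")
```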