22 research outputs found

    Reduced glomerular filtration rate as a predictor of coronary artery disease events in elderly patients

    Background: Chronic kidney disease is independently associated with cardiovascular disease (CVD) events in high-risk populations according to several studies. However, findings from community-based population studies are insufficient. We studied the relationship between estimated glomerular filtration rate (eGFR) and the risk of coronary artery disease (CAD) events in patients attending Zagazig University Hospital, Sharqiya governorate, Egypt. Methods: A total of 800 subjects aged ≥ 60 years admitted to the Internal Medicine Department or attending the medicine outpatient clinic were included in this study. Careful history taking and full clinical examination were performed to assess risk factors for CAD. Serum creatinine, lipid profile, and serum glucose were measured. eGFR was calculated using the creatinine-based MDRD formula. According to eGFR, patients were divided into two groups: group 1 with eGFR ≥ 60 mL/min/1.73 m2 and group 2 with eGFR < 60 mL/min/1.73 m2 (between 40 and 60 mL/min/1.73 m2). Results: 410 patients had eGFR ≥ 60 mL/min/1.73 m2, while 390 patients had eGFR < 60 mL/min/1.73 m2. eGFR was lower in patients with CAD (62 ± 13 mL/min/1.73 m2) than in patients without CAD (76 ± 11 mL/min/1.73 m2) (P ≤ 0.001). Older age, hypertension, diabetes, and low HDL were highly significant risk factors for CAD in these patients (P < 0.001). Conclusions: Reduced eGFR is a significant risk factor for CAD events in older patients. Monitoring of eGFR may have a pivotal role in the early detection and management of CAD in such patients. Keywords: Coronary artery disease; Glomerular filtration rate; Elderly
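The creatinine-based MDRD estimate used in the study above can be sketched as follows. This is a minimal sketch of the commonly published 4-variable MDRD equation; the abstract does not state which calibration was used, so the coefficient 186 (original, non-IDMS-traceable assays; IDMS-traceable assays use 175) is an assumption:

```python
def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool = False) -> float:
    """4-variable MDRD estimate of GFR in mL/min/1.73 m2.

    Assumes the original coefficient 186; IDMS-traceable creatinine
    assays use 175 instead (the study does not specify which).
    """
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Illustrative example: a 70-year-old woman with serum creatinine 1.4 mg/dL
print(round(egfr_mdrd(1.4, 70, female=True), 1))
```

A value below 60 mL/min/1.73 m2 would place a patient in group 2 of the study.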

    Chronic pain in hemodialysis patients: Role of bone mineral metabolism

    Background: Pain is one of the most common complaints in clinical practice because it is a symptom of a myriad of physical and mental problems. The high prevalence of pain in the chronic kidney disease (CKD) population is particularly concerning because pain has been shown to adversely affect quality of life. The aim of this study was to evaluate the prevalence and possible causes of chronic pain in patients with end-stage renal disease on long-term hemodialysis (HD). Methods: We prospectively enrolled 100 patients who had been undergoing maintenance HD for at least 6 months. Pain was evaluated using the Brief Pain Inventory (BPI). Data collected on each participant included age, gender, body mass index (BMI), time on dialysis, and biochemical findings. Results: The average age was 42.06 years (range 22-58 years); the average duration on dialysis was 4.97 years. 52 patients were male and 48 were female. Although 52% of patients experienced chronic pain, only 25% described the pain as severe, 28% as moderate, and 52% as mild. Musculoskeletal pain was the most frequent form of chronic pain reported by patients on HD (54%). Malnutrition and high CRP were strongly associated with chronic pain (p < 0.001). A highly significant correlation was found between lower calcium, lower 25(OH)D3 levels, higher parathyroid hormone (PTH) levels, and chronic pain (p < 0.001). Conclusion: Chronic pain is highly prevalent among long-term hemodialysis patients. Malnutrition, high CRP, and disturbed bone mineral metabolism are strongly correlated with the occurrence of this pain

    The relationship between serum osteopontin level and parameters of Chronic Kidney Disease – mineral bone disease in patients on regular hemodialysis

    Background: Chronic Kidney Disease (CKD) is becoming a major health concern worldwide. For many patients, CKD is associated with substantial morbidity and mortality. Osteopontin (OPN) is an extracellular matrix protein first identified in bone tissue; it has pleiotropic functions owing to its expression in the main organs and apparatuses. It is a phosphorylated glycoprotein composed of 314 amino acids, involved in biomineralization and remodeling. Objective: This research aimed to assess the serum level of osteopontin in patients with end-stage renal disease (ESRD) on regular haemodialysis and to correlate osteopontin levels in these patients with other biomarkers of CKD-mineral and bone disorder (CKD-MBD). Patients & Methods: This study was conducted on 160 participants divided into two groups: a control group of 80 healthy subjects of both sexes, and a patient group of 80 ESRD patients of both sexes on regular hemodialysis. Serum osteopontin was measured in all participants by enzyme-linked immunosorbent assay (ELISA). Results: Serum osteopontin levels were higher in ESRD patients on regular dialysis than in healthy individuals, suggesting possible predictive value for CKD development. They were also positively correlated with serum phosphorus, serum alkaline phosphatase, and serum parathyroid hormone, which are parameters of chronic kidney disease-mineral and bone disorder. Conclusion: Osteopontin may be considered an early marker of chronic kidney disease

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London
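The random-effects pooling reported above can be illustrated with a DerSimonian-Laird sketch. This is an assumed, simplified version on the raw proportion scale (published meta-analyses often transform proportions and use dedicated software), and the input numbers below are illustrative, not the study's data:

```python
import math

def pooled_prevalence_dl(props, sizes):
    """DerSimonian-Laird random-effects pool of study prevalences.

    Returns (pooled prevalence, standard error, I2 heterogeneity %).
    Uses a normal approximation on the raw proportion scale.
    """
    variances = [p * (1 - p) / n for p, n in zip(props, sizes)]
    w = [1 / v for v in variances]
    fixed = sum(wi * p for wi, p in zip(w, props)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, props))
    df = len(props) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * p for wi, p in zip(w_re, props)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se, i2

# Illustrative prevalences from three hypothetical studies
pooled, se, i2 = pooled_prevalence_dl([0.33, 0.07, 0.25], [300, 400, 250])
print(round(pooled, 3), round(i2, 1))
```

With heterogeneous studies, tau^2 widens the pooled confidence interval, which is why the abstract reports high I2 alongside wide CIs.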

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001).
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10% (i.q.r. 1-30%) of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) out of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    Get PDF
    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century

    Diagnostic performance and inter-observer variability of CO-RADS in the triage of patients with suspected COVID-19 infection : initial experience in Zagazig University Hospital

    Purpose: In many healthcare settings in developing nations, multislice computed tomography (MSCT) imaging may be the only available diagnostic modality for patients with suspected COVID-19 infection, owing to a shortage of laboratory kits. This study aimed to evaluate the diagnostic performance and inter-observer variability of CO-RADS (COVID-19 Reporting and Data System) in the triage of patients with suspected COVID-19 infection in Zagazig University Hospital. Material and methods: This study included 2500 patients with suspected COVID-19 infection (mean age 60.61 ± 13.89 years; 61.4% male). Unstable patients requiring urgent invasive ventilation, patients with acute coronary syndrome, pregnant females, and patients with RT-PCR results available prior to MSCT were excluded. RT-PCR was performed in all included patients. Results: Fever and dry cough were the most common clinical symptoms, detected in 80.16% and 52.00%, respectively. The most common comorbidities were cardiovascular diseases, followed by chronic lung disease and diabetes, found in 27.36%, 22.80%, and 18.00%, respectively. Of the 1500 RT-PCR-positive patients, 40% had a CO-RADS score of 5, while 3.4% had a CO-RADS score of 1. Of the 1000 RT-PCR-negative patients, 36% had a CO-RADS score of 2 and 1% were scored as CO-RADS 5. Inter-observer agreement was excellent (weighted κ = 0.846) and was most pronounced for CO-RADS 5 (24.40%). The sensitivity of CO-RADS was higher in the 2nd scenario (83.27% vs. 55.27%), while the specificity was higher in the 1st scenario (95% vs. 65%). Conclusion: The CO-RADS scoring system is a sensitive and specific method that can help in the diagnosis of COVID-19 during the peak of the pandemic. CO-RADS can serve as a triage test in resource-constrained environments, helping to optimize the use of RT-PCR tests, isolation beds, and intensive care units
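The weighted κ reported above measures chance-corrected agreement between ordinal CO-RADS ratings. A minimal sketch with linear weights is shown below; the rater data are illustrative, not the study's, and the study may have used quadratic rather than linear weights:

```python
def weighted_kappa(ratings_a, ratings_b, categories):
    """Cohen's weighted kappa with linear weights w_ij = |i - j| / (k - 1)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(ratings_a)
    # observed contingency counts
    obs = [[0] * k for _ in range(k)]
    for a, b in zip(ratings_a, ratings_b):
        obs[idx[a]][idx[b]] += 1
    row = [sum(obs[i]) for i in range(k)]                        # rater A marginals
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # rater B marginals
    w = lambda i, j: abs(i - j) / (k - 1)
    observed = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * row[i] * col[j] / n for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Illustrative CO-RADS scores (1-5) from two hypothetical readers
a = [5, 5, 4, 3, 2, 1, 5, 4, 2, 3]
b = [5, 4, 4, 3, 2, 1, 5, 5, 2, 2]
print(round(weighted_kappa(a, b, [1, 2, 3, 4, 5]), 3))
```

Values near 1 indicate near-perfect agreement; 0.846, as in the study, is conventionally interpreted as excellent.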

    A comparative study between the effect of 17-β estradiol and antioxidants combination on some menopausal changes in oophorectomised rats

    Background: As oxidative stress (OS) is proposed to be responsible for many menopause-associated disorders, antioxidants may play an important role in this setting. The aim of this work was to compare the effects of oestrogen replacement therapy and antioxidant supplements (vitamin C and a low dose of vitamin A) on some menopause-associated changes in oophorectomised rats. Materials and methods: Forty albino female rats were divided into 4 groups: a normal control group, an oophorectomised group, an oophorectomised group treated with 17-β estradiol (oophorectomised + E2), and an oophorectomised group treated with vitamins (oophorectomised + vit). The following were measured: total antioxidant (TAO) and malondialdehyde (MDA), lipid profile, serum insulin, glucose and homeostasis model assessment-insulin resistance (HOMA-IR), bone-specific alkaline phosphatase (BALP), urinary hydroxyproline, weight gain, and visceral fat. Results: A positive correlation was found between MDA and low-density lipoprotein-cholesterol (LDL) (r = 0.694, P = 0.000), HOMA-IR (r = 0.691, P = 0.000), BALP (r = 0.563, P = 0.000), and urinary hydroxyproline level (r = 0.761, P = 0.000). These results suggest that OS might be a cause of the dyslipidemia, insulin resistance, and osteoporosis associated with menopause. Both E2 and vitamins in oophorectomised rats led to a significant decrease in MDA (F = 33.402, P = 0.000), weight gain and visceral fat (F = 7.589, P = 0.000 and F = 3.748, P = 0.019, respectively), cholesterol (F = 40.748, P = 0.0001), and LDL cholesterol (F = 55.168, P = 0.0001), and a significant increase in HDL (F = 18.393, P = 0.0001) and TAO levels (F = 14.781, P = 0.000) compared with untreated oophorectomised rats. Both treatments also led to a significant decrease in HOMA-IR (F = 18.933, P = 0.000), BALP (F = 13.202, P = 0.000), and urinary hydroxyproline (F = 220.012, P = 0.000).
Interestingly, oophorectomised rats showed a decrease in triglyceride levels, which were significantly increased by E2 administration, whereas antioxidant administration produced no change (F = 34.267, P = 0.0001). Conclusion: Our results indicate similar effects of E2 and antioxidant supplements (vitamin C and low-dose vitamin A) in surgically induced menopause in rats with regard to oxidative stress, weight gain, atherogenic lipid profile changes, insulin sensitivity, and bone turnover. However, differences between preclinical and clinical studies must be taken into consideration, especially when moving from animal studies to clinical trials
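The HOMA-IR index used above is computed from fasting insulin and glucose. A sketch using the standard published constants follows; the abstract does not state which unit convention was used, so the mg/dL form is an assumption:

```python
def homa_ir(fasting_insulin_uU_ml: float, fasting_glucose_mg_dl: float) -> float:
    """Homeostasis model assessment of insulin resistance.

    Standard formula: insulin (uU/mL) x glucose (mmol/L) / 22.5,
    equivalent to insulin (uU/mL) x glucose (mg/dL) / 405.
    """
    return fasting_insulin_uU_ml * fasting_glucose_mg_dl / 405.0

# Example: fasting insulin 10 uU/mL, fasting glucose 90 mg/dL -> 2.22
print(round(homa_ir(10, 90), 2))
```

Higher values indicate greater insulin resistance, which is why a fall in HOMA-IR under both treatments is read as improved insulin sensitivity.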

    Prevalence of acute kidney injury in cardiac patients in the Intensive Care Unit

    Background: Acute kidney injury (AKI) has consistently been associated with adverse clinical outcomes after acute myocardial infarction (MI). In addition, AKI is well known as a potent predictor of the clinical course in heart failure patients. The aim of this study was to assess the prevalence and risk factors of AKI in patients with acute MI and congestive heart failure (CHF) in the ICU at Zagazig University Hospitals, Egypt. Patients and methods: This study included 100 patients with acute MI and 100 patients with CHF admitted to the ICU. They underwent careful history taking, thorough clinical examination, ECG and echocardiographic evaluation, and laboratory investigations, including cardiac enzymes, renal profile, and fasting blood glucose. AKI was defined on the basis of serum creatinine as a surrogate marker for the glomerular filtration rate, together with the calculated estimated glomerular filtration rate. Results: AKI occurred in 47% of patients with CHF and 45% of patients with acute MI; affected patients were significantly older (P=0.013 and 0.004, respectively). In CHF, patients with AKI had significantly higher fasting blood sugar (P=0.011), more frequent abnormal ECG changes (P=0.001), lower ejection fraction (P=0.034), and diastolic dysfunction (P=0.027). In acute MI, patients with AKI had significantly higher fasting blood sugar (P=0.013) and higher troponin I levels (P=0.015). Conclusion: The most important risk factors for AKI in patients with CHF are older age, higher frequency of diabetes mellitus, abnormal ECG changes, lower ejection fraction, and diastolic dysfunction, whereas higher troponin I and older age are the most important risk factors for AKI in patients with acute MI. Careful monitoring of susceptible patients in the ICU is recommended for early detection and management of AKI in these patients