4 research outputs found

    Novel Urinary Biomarkers and Chronic Kidney Disease After Coronary Angiography: A Prospective Case-Controlled Study

    BACKGROUND: Novel urinary biomarkers may have potential for early detection of acute kidney injury (AKI). AIM: The aim of the study was to test two urinary biomarkers, kidney injury molecule-1 (KIM-1) and liver-type fatty acid binding protein (L-FABP), as markers of kidney injury following coronary angiography. METHODS: This was a prospective non-randomized controlled trial performed in two large teaching hospitals. Patients were recruited from the catheter laboratory or from nephrology outpatient clinics. Group A comprised 100 patients with AKI superimposed on chronic kidney disease (CKD) after coronary angiography; Group B comprised 31 patients with stable CKD who served as controls. KIM-1 and L-FABP were measured at baseline and after 3 months. RESULTS: CKD stage progression occurred in 15 patients in Group A compared with two patients in Group B (p = 0.28). The median change in eGFR after 3 months did not differ significantly between the groups (p = 0.8). Median baseline urinary L-FABP was higher in Group A than in Group B (3.7 μg/g vs. 1.82 μg/g). The change in L-FABP from baseline to 3 months differed significantly between the groups (p < 0.001). The median urinary concentrations of KIM-1 and L-FABP were higher at the end of follow-up than at baseline in both groups (p < 0.001). CONCLUSION: Urinary L-FABP correlates with kidney function decline in patients with AKI superimposed on CKD after coronary angiography. Urinary levels of KIM-1 and L-FABP at 3 months increase significantly compared with baseline in patients with progressive CKD.

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10% (i.q.r. 1-30%) of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    Get PDF
    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Impact of Enterobacteriaceae bacteremia on survival in patients with hepatorenal failure

    Enterobacteriaceae are now the predominant pathogens isolated from patients with liver cell failure and associated bloodstream infections. We conducted a retrospective cohort study of patients admitted with a diagnosis of hepatorenal failure (HRF) between June 1999 and May 2008 to investigate the risk factors for Enterobacteriaceae bacteremia (EB). EB was defined as the isolation of an Enterobacteriaceae species from at least one blood culture within three months following the diagnosis of HRF. Variables were collected from the medical records and analyzed in relation to EB. Twenty-four (32.5%) of the 73 patients developed EB. The origin of EB was abdominal in 21% of the patients, urinary in 12.5%, pulmonary in 16.5%, and primary in the remaining patients (50%). Two-thirds of EB episodes occurred within 10 days of the development of HRF. The main pathogens were Escherichia coli (44%), Enterobacter species (20%), and Klebsiella pneumoniae (22%). Eighteen patients (75%) with EB died. Variables significantly associated with EB on multivariate analysis were a Model for End-Stage Liver Disease (MELD) score >20 [odds ratio (OR): 2.84, P < 0.02], post-hepatitis B liver cirrhosis (OR: 4.72, P < 0.05), post-hepatitis C liver cirrhosis (OR: 3.48, P < 0.05), and initial serum creatinine level on admission to the intensive care unit (OR: 2.56, P < 0.02). EB is a frequent and severe complication of HRF. Patients with post-hepatitis B or C cirrhosis, higher serum creatinine, and a more severe liver cell failure score are at high risk of developing EB.