185 research outputs found

    Medication discrepancies in late-stage chronic kidney disease.

    Background: Late-stage chronic kidney disease (LS-CKD) can be defined by a glomerular filtration rate (GFR) of 0-30 mL/min. It is a period of risk for medication discrepancies because of frequent hospitalizations, fragmented medical care, inadequate communication and polypharmacy. In this study, we sought to characterize medication discrepancies in LS-CKD. Methods: We analyzed all patients enrolled in Northwell Health's Healthy Transitions in LS-CKD program. All patients had an estimated GFR of 0-30 mL/min and were not on dialysis. Medications were reviewed by a nurse at a home visit. Patients' medication usage and practice were compared with nephrologists' medication lists, and discrepancies were characterized. Patients were categorized as having either no discrepancies or one or more. Associations between patient characteristics and number of medication discrepancies were evaluated by chi-square or Fisher's exact test for categorical variables, and two-sample tests for continuous variables. Results: Seven hundred and thirteen patients with a median age of 70 (interquartile range 58-79) years were studied. There were 392 patients (55.0% of the study population) with at least one medication discrepancy. The therapeutic classes of medications with the most frequently occurring medication discrepancies were cardiovascular agents, vitamins, bone and mineral disease agents, diuretics, analgesics and diabetes medications. In multivariable analysis, factors associated with higher risk of discrepancies were congestive heart failure [odds ratio (OR) 2.13; 95% confidence interval (CI) 1.44-3.16; P = 0.0002] and number of medications (OR 1.29; 95% CI 1.21-1.37; P < 0.0001). Conclusions: Medication discrepancies are common in LS-CKD, affect the majority of patients and include high-risk medication classes. Congestive heart failure and total number of medications are independently associated with greater risk for multiple drug discrepancies. The frequency of medication discrepancies indicates a need for great care in the medication management of these patients.
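The categorical comparisons named in the Methods (chi-square or Fisher's exact test, plus the odds ratios reported in the Results) can be sketched for a single 2×2 table. This is a minimal illustration with hypothetical cell counts, not data from the study:

```python
from math import comb

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table [[a, b], [c, d]] (e.g. exposure vs. any
    medication discrepancy): cross-product ratio (a*d)/(b*c)."""
    return (a * d) / (b * c)

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test p-value for a 2x2 table.
    Sums hypergeometric probabilities of every table with the same
    margins whose probability is <= that of the observed table."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def prob(x):
        # P(X = x) under the hypergeometric distribution fixed by the margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical counts: exposed/unexposed vs. discrepancy yes/no
print(odds_ratio(10, 5, 4, 11))            # 5.5
print(fisher_exact_two_sided(10, 5, 4, 11))
```

In practice one would use `scipy.stats.fisher_exact` or a logistic regression (which the multivariable analysis implies) rather than hand-rolled code; the sketch only shows what the reported OR and p-values measure.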

    Conversion to sirolimus for chronic renal allograft dysfunction: risk factors for graft loss and severe side effects

    We retrospectively reviewed our experience with 45 kidney transplant recipients (KTR) who were switched from calcineurin inhibitors (CNI) to sirolimus (SRL), mainly for chronic allograft dysfunction (CAD) (41/45). The mean serum creatinine at switch was 2.5 ± 0.8 mg/dl. At 1 year, patient survival was 93%. Death-censored graft survival was 67% at 1 year and 54% at 2 years. SRL was stopped because of severe side effects in 15 patients. Among these, eight patients developed ‘de novo’ high-grade proteinuria. Univariate analysis revealed that (1) a higher SRL level at 1 month was a predictor of SRL withdrawal due to severe side effects (P = 0.006), and (2) predictors of graft failure after SRL conversion were a low SRL loading dose (P = 0.03) and a higher creatinine level at conversion (P = 0.003).

    Extracorporeal photopheresis for the treatment of graft rejection in 33 adult kidney transplant recipients

    Background - Extracorporeal photopheresis (ECP) has shown encouraging results in the prevention of allograft rejection in heart transplantation. However, the role of ECP in kidney transplant (KT) rejection needs to be determined. Methods - This multicentre retrospective study included 33 KT recipients who were treated with ECP for allograft rejection (23 acute antibody-mediated rejections (AMRs), 2 chronic AMRs and 8 acute cellular rejections (ACRs)). The ECP indications were KT rejection in patients who were resistant to standard therapies (n = 18) or in patients for whom standard therapies were contraindicated because of concomitant infections or cancers (n = 15). Results - At 12 months (M12) post-ECP, 11 patients (33%) had a stabilization of kidney function, with a graft survival rate of 61%. The Banff AMR score (g + ptc + v) was a risk factor for graft loss at M12 (HR 1.44 [1.01-2.05], p < 0.05). The factorial mixed data analysis identified 2 clusters. Patients with a functional graft at M12 tended to have cellular and/or chronic rejections. Patients with graft loss at M12 tended to have acute rejections and/or AMR; higher serum creatinine levels, DSA levels and histologic scores of AMR; and a longer delay between the rejection and the start of ECP than patients with functional grafts. Conclusions - ECP may be helpful to control ACR or moderate AMR in KT recipients presenting with concomitant opportunistic infections or malignancies when it is initiated early.

    Exploring the theoretical foundations of visual art programmes for people living with dementia

    Despite the growing international innovations for visual arts interventions in dementia care, limited attention has been paid to their theoretical basis. In response, this paper explores how and why visual art interventions in dementia care influence changes in outcomes. The theory-building process consists of a realist review of primary research on visual art programmes. This aims to uncover what works, for whom, how, why and in what circumstances. We undertook a qualitative exploration of stakeholder perspectives on art programmes, and then synthesised these two pieces of work alongside broader theory to produce a conceptual framework for intervention development, further research and practice. This suggests effective programmes are realised through essential attributes of two key conditions (provocative and stimulating aesthetic experience; dynamic and responsive artistic practice). These conditions are important for cognitive, social and individual responses, leading to benefits for people with early to more advanced dementia. This work represents a starting point for identifying theories of change for arts interventions, and for further research to critically examine, refine and strengthen the evidence base for the arts in dementia care. Understanding the theoretical basis of interventions is important for service development, evaluation and implementation.

    Acute kidney injury in patients hospitalized with COVID-19

    © 2020 International Society of Nephrology. The rate of acute kidney injury (AKI) among patients hospitalized with COVID-19, and its associated outcomes, are not well understood. This study describes the presentation, risk factors and outcomes of AKI in patients hospitalized with COVID-19. We reviewed the health records of all patients hospitalized with COVID-19 between March 1 and April 5, 2020, at 13 academic and community hospitals in metropolitan New York. Patients younger than 18 years of age, with end-stage kidney disease or with a kidney transplant were excluded. AKI was defined according to KDIGO criteria. Of 5,449 patients admitted with COVID-19, AKI developed in 1,993 (36.6%). The peak stages of AKI were stage 1 in 46.5%, stage 2 in 22.4% and stage 3 in 31.1%. Of these, 14.3% required renal replacement therapy (RRT). AKI was primarily seen in COVID-19 patients with respiratory failure, with 89.7% of patients on mechanical ventilation developing AKI compared with 21.7% of non-ventilated patients. Of the 285 patients requiring RRT, 276 (96.8%) were on ventilators. Of patients who required ventilation and developed AKI, 52.2% had the onset of AKI within 24 hours of intubation. Risk factors for AKI included older age, diabetes mellitus, cardiovascular disease, black race, hypertension and need for ventilation and vasopressor medications. Among patients with AKI, 694 died (35%), 519 (26%) were discharged and 780 (39%) were still hospitalized. AKI occurs frequently among patients with COVID-19. It occurs early and in temporal association with respiratory failure and is associated with a poor prognosis.
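The staging reported above follows the KDIGO serum-creatinine criteria. A simplified sketch of that classification logic (creatinine criteria only; the guideline's urine-output criteria are omitted, and all thresholds are in mg/dL) might look like:

```python
def kdigo_stage(baseline_scr, current_scr,
                rise_ge_0_3_in_48h=False, on_rrt=False):
    """Return the KDIGO AKI stage (0 = no AKI) from serum creatinine.

    Simplified sketch: urine-output criteria are omitted, and the
    acute-rise qualifier for the SCr >= 4.0 mg/dL rule is reduced to
    the same 48-hour flag used for stage 1.
    """
    if on_rrt:
        return 3  # initiation of renal replacement therapy is stage 3
    ratio = current_scr / baseline_scr
    if ratio < 1.5 and not rise_ge_0_3_in_48h:
        return 0  # does not meet the creatinine definition of AKI
    if ratio >= 3.0 or current_scr >= 4.0:
        return 3  # >= 3.0x baseline, or SCr >= 4.0 mg/dL
    if ratio >= 2.0:
        return 2  # 2.0-2.9x baseline
    return 1      # 1.5-1.9x baseline, or rise >= 0.3 mg/dL in 48 h

print(kdigo_stage(1.0, 2.2))  # 2
```

Any real implementation would also need the urine-output criteria and the timing windows defined in the KDIGO guideline; this only shows how a peak stage such as those in the Results is assigned.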

    Global 30-day outcomes after bariatric surgery during the COVID-19 pandemic (GENEVA): an international cohort study


    Micromechanical Properties of Injection-Molded Starch–Wood Particle Composites

    The micromechanical properties of injection-molded starch–wood particle composites were investigated as a function of particle content and humidity conditions. The composite materials were characterized by scanning electron microscopy and X-ray diffraction methods. The microhardness of the composites was shown to increase notably with the concentration of the wood particles. In addition, creep behavior under the indenter and temperature dependence were evaluated in terms of the independent contributions of the starch matrix and the wood microparticles to the hardness value. The influence of drying time on the density and weight uptake of the injection-molded composites was highlighted. The results revealed the role of the mechanism of water evaporation, showing that the temperature dependence of water uptake was greater for the starch–wood composites than for the pure starch sample. Experiments performed during the drying process at 70°C indicated that the wood in the starch composites did not prevent water loss from the samples.