70 research outputs found

    Association between HTR2C polymorphisms and weight loss in obese patients

    OBJECTIVE: To investigate whether the HTR2C rs1414334 and -759 C/T polymorphisms are associated with weight loss in an anti-obesity programme. DESIGN AND METHODS: A longitudinal observational follow-up study was used to assess the association between HTR2C genotypes and weight loss during a nine-month programme in an obesity clinic. Caucasian patients aged 18 years or older were included. Data were extracted from the patients' medical records. In total, 128 patients were included (29 males). RESULTS: There was a significant association between the HTR2C -759 T allele and resistance to weight loss in the first month of the programme. For each T allele present, there was 0.78% (95% confidence interval [95%-CI] 0.19-1.38; P = 0.011) less weight loss (as a percentage of the body weight at start). Patients carrying the variant HTR2C -759 T allele were also less likely to reach >7% weight loss (odds ratio [OR] 0.23; 95%-CI 0.06-0.85; P = 0.028), and dropped out of the programme sooner (-0.78 months; 95%-CI -1.51 to -0.06; P = 0.035; corrected for gender). No associations were found between the HTR2C rs1414334 genotype and any of the primary or secondary endpoints for weight loss. CONCLUSION: Patients carrying the HTR2C -759 T allele were more resistant to weight loss and dropped out of the programme sooner. However, these effects were small and explain only a small part of a very complex puzzle. Genotyping HTR2C to predict a patient's chance of success in an obesity clinic is therefore not warranted.
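
    The reported per-allele effect implies an additive genetic model: percentage weight loss regressed on the number of T alleles (0, 1, or 2). Below is a minimal sketch of such a model on simulated data; the effect size is taken from the abstract, while the sample values and noise are invented for illustration.

        # Additive (per-allele) model: % weight loss vs. number of T alleles.
        # Simulated data; only the -0.78%/allele effect comes from the abstract.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 128
        t_alleles = rng.integers(0, 3, size=n)        # 0, 1 or 2 copies of the T allele
        weight_loss = 3.0 - 0.78 * t_alleles + rng.normal(0, 2.0, size=n)

        X = sm.add_constant(t_alleles.astype(float))  # intercept + allele count
        fit = sm.OLS(weight_loss, X).fit()
        print(fit.params)      # slope near -0.78: less weight loss per T allele
        print(fit.conf_int())  # 95% CI for the per-allele effect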

    Immune Reconstitution Kinetics as an Early Predictor for Mortality using Various Hematopoietic Stem Cell Sources in Children

    The severity of complications of allogeneic hematopoietic stem cell transplantation (HSCT) is governed mainly by the status of immune reconstitution. In this study, we investigated differences in immune reconstitution with different cell sources and the association between the kinetics of immune reconstitution and mortality. Immunophenotyping was performed every 2 weeks in children who had undergone HSCT between 2004 and 2008 at University Medical Center Utrecht. Lymphocyte reconstitution in the first 90 days after HSCT was studied in relation to mortality in 3 HSCT groups: matched sibling bone marrow (BM) recipients (35 patients), unrelated BM recipients (32 patients), and unrelated cord blood recipients (36 patients). The median age of recipients was 5.9 years (range, 0.1-21 years). The nature and speed of T cell, B cell, and natural killer (NK) cell reconstitution were highly dependent on the cell source. In the first 90 days after HSCT, faster B cell and NK cell reconstitution and delayed T cell reconstitution were seen in unrelated cord blood recipients compared with matched sibling BM and unrelated BM recipients. Of the lymphocyte subsets investigated, a large number of NK cells and a more rapid CD4+ immune reconstitution over time, resulting in sustained higher CD4+ counts, were the only predictors of a lower mortality risk across all cell sources. The final model showed that during the first 90 days, patients with an area under the CD4+ cell receiver-operating curve of >4,300 cells/day and no peak in CD4+ cell counts had the highest likelihood of survival (hazard ratio for mortality, 0.2; 95% confidence interval, 0.06-0.5). Our data indicate that CD4+ kinetics may be used to identify patients at greatest risk for mortality early after HSCT.
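
    The CD4+ area-under-the-curve endpoint can be approximated with a trapezoidal rule over the biweekly immunophenotyping measurements. The sketch below uses invented counts for one patient; the 4,300 cells/day cutoff comes from the abstract, but how the study normalized the AUC is an assumption here.

        # Trapezoidal AUC of CD4+ counts over the first 90 days after HSCT.
        # Counts are invented; the per-day normalization is an assumption.
        import numpy as np

        days = np.array([0, 14, 28, 42, 56, 70, 84, 90])         # sampling days
        cd4 = np.array([10, 40, 120, 300, 450, 600, 750, 800])   # CD4+ cells/uL

        auc = np.sum(np.diff(days) * (cd4[:-1] + cd4[1:]) / 2)   # cell-days
        per_day = auc / (days[-1] - days[0])                     # time-averaged count
        print(f"AUC = {auc:.0f} cell-days, average = {per_day:.0f} cells per day")
        print("above 4300 threshold" if per_day > 4300 else "below 4300 threshold")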

    Contextualized Drug–Drug Interaction Management Improves Clinical Utility Compared With Basic Drug–Drug Interaction Management in Hospitalized Patients

    Drug–drug interactions (DDIs) frequently trigger adverse drug events or reduced efficacy. Most DDI alerts, however, are overridden because of irrelevance for the specific patient. Basic DDI clinical decision support (CDS) systems offer limited possibilities for decreasing the number of irrelevant DDI alerts without missing relevant ones. Computerized decision tree rules were designed to context-dependently suppress irrelevant DDI alerts. A crossover study was performed to compare the clinical utility of contextualized and basic DDI management in hospitalized patients. First, a basic DDI-CDS system was used in clinical practice while contextualized DDI alerts were collected in the background. Next, this process was reversed. All medication orders (MOs) from hospitalized patients with at least one DDI alert were included. The following outcome measures were used to assess clinical utility: positive predictive value (PPV), negative predictive value (NPV), number of pharmacy interventions (PIs)/1,000 MOs, and the median time spent on DDI management/1,000 MOs. During the basic DDI management phase, 1,919 MOs/day were included, triggering 220 DDI alerts/1,000 MOs and showing 57 basic DDI alerts/1,000 MOs to pharmacy staff; the PPV was 2.8%, with 1.6 PIs/1,000 MOs costing 37.2 minutes/1,000 MOs. No DDIs were missed by the contextualized CDS system (NPV 100%). During the contextualized DDI management phase, 1,853 MOs/day were included, triggering 244 basic DDI alerts/1,000 MOs and showing 9.6 contextualized DDI alerts/1,000 MOs to pharmacy staff; the PPV was 41.4% (P < 0.01), with 4.0 PIs/1,000 MOs (P < 0.01) and 13.7 minutes/1,000 MOs. The clinical utility of contextualized DDI management exceeds that of basic DDI management.
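
    As an illustration of what a context-dependent suppression rule can look like, here is a sketch for a hypothetical potassium-raising DDI (e.g., an ACE inhibitor plus a potassium-sparing diuretic). The thresholds, time window, and field names are invented and do not reflect the study's actual decision trees.

        # Hypothetical decision-tree rule: suppress the DDI alert only when a
        # recent serum potassium is available and within a safe range.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class PatientContext:
            potassium_mmol_l: Optional[float]   # most recent serum potassium
            days_since_lab: Optional[int]       # age of that measurement

        def show_hyperkalemia_ddi_alert(ctx: PatientContext) -> bool:
            """Return True if the alert should reach pharmacy staff."""
            if ctx.potassium_mmol_l is None or ctx.days_since_lab is None:
                return True                     # no context available: fail safe
            if ctx.days_since_lab > 7:
                return True                     # lab value too old to rely on
            return ctx.potassium_mmol_l >= 5.0  # suppress only if potassium is safe

        print(show_hyperkalemia_ddi_alert(PatientContext(4.1, 2)))  # False: suppressed
        print(show_hyperkalemia_ddi_alert(PatientContext(5.4, 2)))  # True: shown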

    The efficacy of the entire-vial dosing of emicizumab: Real-world evidence on plasma concentrations, bleeds, and drug waste

    Background: Prophylaxis with emicizumab provides effective bleeding protection in persons with hemophilia A (PwHA) but pressures healthcare budgets. The body weight–adjusted dosing at 7-, 14-, or 28-day intervals, according to the label, often mismatches the vial content. Entire-vial dosing resulted in therapeutic concentrations in pharmacokinetic simulations and was introduced to avoid waste. Objectives: The objective of this study was to evaluate the efficacy of entire-vial dosing of emicizumab by investigating real-world evidence on plasma concentrations, bleeds, and drug waste. Methods: This is a single-center, observational study of PwHA receiving emicizumab in mg/kg doses according to the label, but with the dosing interval extrapolated to the nearest vial size. Patient characteristics and bleeds were compared 1 year before starting emicizumab and during emicizumab treatment until January 2022. Concentrations were assessed at weeks 4 and 12 and annually thereafter. Mean (95% CI) annualized bleed rates were compared using negative binomial regression. Drug waste under label-based dosing and entire-vial dosing was compared. Results: A total of 112 individuals (94% severe phenotype and 9% positive FVIII inhibitors) were followed for a median of 56 weeks (interquartile range [IQR] 52-68) before and 51 weeks (IQR 29-75) after starting emicizumab. The median emicizumab dose was 5.9 (IQR 5.5-6.2) mg/kg/4 wk, with median concentrations of 63 (IQR 51-80) μg/mL. The annualized bleed rate of treated bleeds was 3.6 (95% CI 2.9-4.4) before emicizumab and 0.8 (95% CI 0.6-1.1) during emicizumab (P < .001). Drug waste was reduced by 9%. Conclusion: Entire-vial dosing of emicizumab is an attractive treatment option for PwHA, leading to therapeutic plasma concentrations, good bleeding control, and drug waste avoidance.
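
    The core arithmetic of entire-vial dosing is to keep the label's average dose rate (6 mg/kg per 4 weeks, i.e., 1.5 mg/kg/week) while stretching or shrinking the interval so that only whole vials are given. The sketch below is under stated assumptions: the vial sizes are common emicizumab presentations, and the single-vial rounding policy is illustrative, not the paper's exact protocol.

        # Entire-vial dosing sketch: hold the average mg/kg/week rate constant
        # and adapt the interval to the nearest whole-vial content.
        RATE_MG_PER_KG_WEEK = 1.5      # label-average maintenance rate
        VIALS_MG = [30, 60, 105, 150]  # commonly available vial contents

        def entire_vial_plan(weight_kg: float, target_weeks: float = 4.0):
            target_dose = RATE_MG_PER_KG_WEEK * weight_kg * target_weeks
            # take the single vial closest to the target (practice may combine vials)
            vial = min(VIALS_MG, key=lambda v: abs(v - target_dose))
            interval_days = round(vial / (RATE_MG_PER_KG_WEEK * weight_kg) * 7)
            return vial, interval_days

        vial, days = entire_vial_plan(weight_kg=24)
        print(f"give {vial} mg every {days} days")  # 150 mg every 29 days for 24 kg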

    Performance of a trigger tool for detecting adverse drug reactions in patients with polypharmacy acutely admitted to the geriatric ward

    Key summary points
    Aim: To investigate the performance of an adverse drug reaction (ADR) trigger tool in patients with polypharmacy acutely admitted to our geriatric ward.
    Findings: The ADR trigger tool had a positive predictive value (PPV) of 41.8%. Usual care recognised 83.5% of ADRs considered as possible, probable or certain, increasing to 97.1% when restricted to probable and certain ADRs.
    Message: It is unlikely that implementation of the ADR trigger tool will improve detection of unrecognised ADRs in older patients acutely admitted to our geriatric ward.
    Purpose: Adverse drug reactions (ADRs) account for 10% of acute hospital admissions in older people and are often under-recognised by physicians. The Dutch geriatric guideline recommends screening all acutely admitted older patients with polypharmacy with an ADR trigger tool comprising ten triggers and associated drugs frequently causing ADRs. This study investigated the performance of this tool and the recognition by usual care of ADRs detected with the tool. Methods: A cross-sectional study was performed in patients ≥ 70 years with polypharmacy acutely admitted to the geriatric ward of the University Medical Centre Utrecht. Electronic health records (EHRs) were screened for trigger-drug combinations listed in the ADR trigger tool. Two independent appraisers assessed causal probability with the WHO-UMC algorithm and screened EHRs for recognition of ADRs by attending physicians. Performance of the tool was defined as the positive predictive value (PPV) for ADRs with a possible, probable or certain causal relation. Results: In total, 941 trigger-drug combinations were present in 73% (n = 253/345) of the patients. The triggers fall, delirium, renal insufficiency and hyponatraemia covered 86% (n = 810/941) of all trigger-drug combinations. The overall PPV was 41.8% (n = 393/941), but the PPV for individual triggers was highly variable, ranging from 0 to 100%. Usual care recognised the majority of ADRs (83.5%), increasing to 97.1% when restricted to probable and certain ADRs. Conclusion: The ADR trigger tool has predictive value; however, its implementation is unlikely to improve the detection of unrecognised ADRs in older patients acutely admitted to our geriatric ward. Future research is needed to investigate the tool's clinical value when applied to older patients acutely admitted to non-geriatric wards.
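
    The tool's PPV is simply the share of trigger-drug combinations that an appraiser judged a possible, probable, or certain ADR. In the worked sketch below, only the overall totals come from the abstract; the per-trigger split is invented for illustration, though chosen to reproduce the reported margins (393/941 overall, 810/941 for the four main triggers).

        # PPV = true ADRs / trigger-drug combinations. Per-trigger numbers are
        # invented; only the totals (393/941) match the abstract.
        combos = {"fall": (120, 310), "delirium": (90, 200),
                  "renal insufficiency": (110, 180), "hyponatraemia": (60, 120),
                  "other triggers": (13, 131)}    # (true ADRs, combinations)

        true_adrs = sum(t for t, _ in combos.values())
        total = sum(c for _, c in combos.values())
        print(f"overall PPV = {true_adrs}/{total} = {true_adrs / total:.1%}")
        for trigger, (t, c) in combos.items():
            print(f"  {trigger}: {t}/{c} = {t / c:.1%}")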

    Drug waste of ready-to-administer syringes in the intensive care unit: Aseptically prepared syringes versus prefilled sterilized syringes

    Background: The availability of ready-to-administer (RTA) syringes for intravenous (IV) drugs facilitates rapid and safe administration in emergency and intensive care situations. Hospital pharmacies can prepare RTA syringes through aseptic batchwise filling. Because these RTA syringes are overproduced to ensure sufficient availability for patient care and have a limited (microbiological) shelf-life, waste is unavoidable, which contributes to environmental pollution. RTA prefilled sterilized syringes (PFSSs) have much longer shelf-lives than aseptically prepared RTA syringes and might help reduce drug waste. Aim: This study aimed to evaluate the difference in drug waste between RTA syringes prepared through aseptic batchwise filling and RTA PFSSs in the Intensive Care Unit (ICU). Methods: We measured drug waste of RTA syringes over an 8-year period from August 2015 to May 2023 in the 32-bed ICU of the University Medical Center Utrecht. We distinguished between RTA syringes prepared through aseptic batchwise filling by our hospital pharmacy (“RTA aseptic syringes”, shelf-life of 31 days) and RTA PFSSs (shelf-life of 18 months). An intervention group of three drug products that were replaced by PFSSs was compared with a control group of five drug products that were not replaced by PFSSs during the study period. We then defined four periods within the total study period, based on the quarantine time of the RTA aseptic syringes and the time of PFSS introduction: 1) no quarantine, 2) 3-day quarantine, 3) 7-day quarantine, and 4) PFSS introduction. Our primary endpoint was the number of RTA syringes that were wasted, expressed as a percentage of the total number of syringes dispensed to the ICU in each of these four periods. We used a Kruskal-Wallis test to assess whether waste percentages differed between time periods in the control and intervention groups, with a post-hoc Dunn's test for pairwise comparisons. Furthermore, we applied two interrupted time series (ITS) analyses to visualize and test the effect of introducing different quarantine times and the PFSSs on the waste percentage. Results: Introduction of PFSSs significantly decreased drug waste of RTA syringes irrespective of drug type in the intervention group, from 31% during the 7-day quarantine period to 5% after introduction of the PFSSs (p<0.001). The control group showed no significant decrease in drug waste over the same time periods (from 20% to 16%; p=0.726). We observed a significant difference in the total drug waste of RTA aseptic syringes between time periods, which may be attributed to the implementation of different quality control quarantine procedures. The ITS model of the intervention group showed an immediate decrease of 17.7% in waste percentage after the introduction of PFSSs (p=0.083). Conclusion: Drug waste of RTA syringes for the ICU can be significantly decreased by introducing PFSSs, helping hospitals enhance environmental sustainability. Furthermore, the waste percentage of RTA syringes prepared through aseptic batchwise filling is significantly affected by the duration of quarantine time.
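
    An interrupted time series of this kind is typically fitted as a segmented regression with an immediate level change (and optionally a slope change) at the intervention. Below is a minimal sketch on invented monthly waste percentages; the 17.7-point level drop is taken from the abstract, everything else is illustrative and the study's actual model specification may differ.

        # Segmented regression (ITS) sketch: level shift at PFSS introduction.
        # Monthly waste percentages are simulated, not the study's data.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        months = np.arange(24, dtype=float)
        pfss = (months >= 12).astype(float)   # 1 from PFSS introduction onward
        waste_pct = 31 - 17.7 * pfss + rng.normal(0, 3, size=24)

        # Columns: pre-intervention trend, immediate level change, slope change
        X = sm.add_constant(np.column_stack([months, pfss, (months - 12) * pfss]))
        fit = sm.OLS(waste_pct, X).fit()
        print(fit.params)  # [baseline, pre-trend, level change ~ -17.7, slope change]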

    Detectability of Medication Errors With a STOPP/START-Based Medication Review in Older People Prior to a Potentially Preventable Drug-Related Hospital Admission.

    INTRODUCTION Multimorbidity and polypharmacy are risk factors for drug-related hospital admissions (DRAs) in the ageing population. DRAs caused by medication errors (MEs) are considered potentially preventable. The STOPP/START criteria were developed to detect potential MEs in older people. OBJECTIVE The aim of this study was to assess the detectability of MEs with a STOPP/START-based in-hospital medication review in older people with polypharmacy and multimorbidity prior to a potentially preventable DRA. METHODS Hospitalised older patients (n = 963) with polypharmacy and multimorbidity from the intervention arm of the OPERAM trial received a STOPP/START-based in-hospital medication review by a pharmacotherapy team. Readmissions within 1 year after the in-hospital medication review were adjudicated for drug-relatedness. A retrospective assessment was performed to determine whether MEs identified at the first DRA were detectable during the in-hospital medication review. RESULTS In total, 84 of 963 OPERAM intervention patients (8.7%) were readmitted with a potentially preventable DRA, of which 72 patients (n = 77 MEs) were eligible for analysis. About half (48%, n = 37/77) of the MEs were not present during the in-hospital medication review and therefore were not detectable at that time. The pharmacotherapy team recommended a change in medication regimen for 50% (n = 20/40) of the MEs that were present, which corresponds to 26% (n = 20/77) of all MEs identified at readmission; however, these recommendations were not implemented. CONCLUSION MEs identified at readmission were not addressed by a single prior in-hospital medication review because these MEs either occurred after the medication review (~50%), or no recommendation was given during the medication review (~25%), or the recommendation was not implemented (~25%). Future research should focus on optimising the timing and frequency of medication review and the implementation of proposed medication recommendations. REGISTRATION ClinicalTrials.gov identifier: NCT02986425. December 8, 2016. FUNDING European Union HORIZON 2020, Swiss State Secretariat for Education, Research and Innovation (SERI), Swiss National Science Foundation (SNSF).
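
    The approximate thirds in the conclusion follow directly from the counts in the results; the small arithmetic check below uses only numbers from the abstract.

        # Breakdown of the 77 medication errors (MEs) found at readmission.
        total_mes = 77
        occurred_after_review = 37          # not yet present at the medication review
        present_at_review = total_mes - occurred_after_review            # 40 MEs
        advised_not_implemented = 20        # change recommended but not implemented
        no_recommendation = present_at_review - advised_not_implemented  # 20 MEs

        for label, n in [("occurred after the review", occurred_after_review),
                         ("recommendation not implemented", advised_not_implemented),
                         ("no recommendation given", no_recommendation)]:
            print(f"{label}: {n}/{total_mes} = {n / total_mes:.0%}")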