    The risk of delirium after sedation with propofol or midazolam in intensive care unit patients

    AIM: Knowledge of risk factors may provide strategies to reduce the high burden of delirium in intensive care unit (ICU) patients. We aimed to compare the risk of delirium after deep sedation with propofol versus midazolam in ICU patients. METHODS: In this prospective cohort study, ICU patients who were in an unarousable state for ≥24 h due to continuous sedation with propofol and/or midazolam were included. Patients admitted ≤24 h, those with an acute neurological disorder and those receiving palliative sedation were excluded. Patients were assessed daily for delirium during the 7 days following an unarousable state due to continuous sedation. RESULTS: Among 950 included patients, 605 (64%) were delirious during the 7 days after awakening. The proportion of subsequent delirium was higher after midazolam sedation (152/207 [73%] patients) and after combined propofol and midazolam sedation (257/377 [68%] patients) than after propofol sedation only (196/366 [54%] patients). Midazolam sedation (adjusted cause-specific hazard ratio [adj. cause-specific HR] 1.32, 95% confidence interval [CI] 1.05-1.66) and combined propofol and midazolam sedation (adj. cause-specific HR 1.29, 95% CI 1.06-1.56) were associated with a higher risk of subsequent delirium compared to propofol sedation only. CONCLUSION: This study among sedated ICU patients suggests that, compared to propofol sedation, midazolam sedation is associated with a higher risk of subsequent delirium. This risk seems more apparent in patients with high cumulative intravenous midazolam doses. Our findings underpin the recommendation of the Society of Critical Care Medicine Pain, Agitation/Sedation, Delirium, Immobility (rehabilitation/mobilization), and Sleep (disruption) guidelines to use propofol rather than benzodiazepines for sedation in ICU patients.
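
    The cause-specific hazard ratios above come from Cox regression in which competing events are censored at the time they occur. Below is a minimal sketch of that approach using Python's lifelines; the file name, covariates and column names are illustrative assumptions, not the authors' actual analysis code.

```python
# Sketch of a cause-specific Cox model for delirium after sedation.
# Competing events (e.g. death before delirium) are treated as censored
# at their event time, which yields cause-specific hazards.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("sedation_cohort.csv")  # hypothetical patient-level extract

# 1 if delirium occurred within the 7-day window, 0 if censored
# (including patients who died or were discharged delirium-free)
df["delirium_event"] = (df["event_type"] == "delirium").astype(int)

cph = CoxPHFitter()
cph.fit(
    df[["days_to_event", "delirium_event",
        "midazolam_only", "propofol_and_midazolam", "age", "apache_iv"]],
    duration_col="days_to_event",
    event_col="delirium_event",
)
cph.print_summary()  # cause-specific HRs with 95% CIs per covariate
```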

    Drug waste of ready-to-administer syringes in the intensive care unit: Aseptically prepared syringes versus prefilled sterilized syringes

    Background: The availability of ready-to-administer (RTA) syringes for intravenous (IV) drugs facilitates rapid and safe administration in emergency and intensive care situations. Hospital pharmacies can prepare RTA syringes through aseptic batchwise filling. Due to excess production of these RTA syringes for sufficient availability for patient care and their limited (microbiological) shelf-life, waste is unavoidable, which contributes to environmental pollution. RTA prefilled sterilized syringes (PFSSs) have much longer shelf-lives than aseptically prepared RTA syringes and might contribute to reducing drug waste. Aim: This study aimed to evaluate the difference in drug waste between RTA syringes prepared through aseptic batchwise filling and RTA PFSSs in the Intensive Care Unit (ICU). Methods: We measured drug waste of RTA syringes over an 8-year period from August 2015 to May 2023 in the 32-bed ICU of the University Medical Center Utrecht. We distinguished between RTA syringes prepared through aseptic batchwise filling by our hospital pharmacy (“RTA aseptic syringes”, shelf-life of 31 days) and RTA PFSSs (shelf-life of 18 months). An intervention group of three drug products that were replaced by PFSSs was compared to a control group of five drug products that were not replaced by PFSSs during the study period. We then defined four periods within the total study period, based on the quarantine time of the RTA aseptic syringes and the time of PFSS introduction: 1) no quarantine, 2) 3-day quarantine, 3) 7-day quarantine and 4) PFSS introduction. Our primary endpoint was the number of RTA syringes that were wasted, expressed as a percentage of the total number of syringes dispensed to the ICU in each of these four periods. We used a Kruskal-Wallis test to assess whether waste percentages differed between time periods in the control and intervention groups, with a post-hoc Dunn's test for pairwise comparisons. Furthermore, we applied two interrupted time series (ITS) analyses to visualize and test the effect of introducing different quarantine times and the PFSSs on waste percentage. Results: Introduction of PFSSs significantly decreased drug waste of RTA syringes irrespective of drug type in the intervention group, from 31% during the 7-day quarantine period to 5% after introduction of the PFSSs (p<0.001). The control group showed no significant decrease in drug waste over the same time periods (from 20% to 16%; p=0.726). We observed a significant difference in the total drug waste of RTA aseptic syringes between time periods, which may be attributed to the implementation of different quality control quarantine procedures. The ITS model of the intervention group showed an immediate decrease of 17.7% in waste percentage after the introduction of PFSSs (p=0.083). Conclusion: Drug waste of RTA syringes for the ICU can be significantly decreased by introducing PFSSs, supporting hospitals in enhancing environmental sustainability. Furthermore, the waste percentage of RTA syringes prepared through aseptic batchwise filling is significantly affected by the duration of the quarantine time.
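
    For readers wanting to reproduce this kind of analysis, the sketch below pairs a Kruskal-Wallis test and Dunn's post-hoc test with a simple segmented-regression ITS model. The file name, column names and monthly aggregation are assumptions for illustration, not the study's actual pipeline.

```python
# Kruskal-Wallis across the four study periods, Dunn's post-hoc test,
# and a segmented-regression interrupted time series for the PFSS switch.
import pandas as pd
import scikit_posthocs as sp
import statsmodels.formula.api as smf
from scipy.stats import kruskal

# hypothetical monthly aggregate: columns month, period, pct_wasted
waste = pd.read_csv("monthly_waste.csv")

groups = [g["pct_wasted"].to_numpy() for _, g in waste.groupby("period")]
h_stat, p_overall = kruskal(*groups)  # do the periods differ at all?
pairwise = sp.posthoc_dunn(waste, val_col="pct_wasted",
                           group_col="period", p_adjust="bonferroni")

# ITS: immediate level change at PFSS introduction plus trend change,
# controlling for the pre-existing time trend.
waste["t"] = range(len(waste))
waste["pfss"] = (waste["period"] == "pfss_introduction").astype(int)
its = smf.ols("pct_wasted ~ t + pfss + pfss:t", data=waste).fit()
print(its.summary())  # 'pfss' coefficient = immediate level change
```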

    Busulfan target exposure attainment in children undergoing allogeneic hematopoietic cell transplantation: a single day versus a multiday therapeutic drug monitoring regimen

    Busulfan exposure has previously been linked to clinical outcomes, hence the need for therapeutic drug monitoring (TDM). The study objective was to evaluate the effect of day-1 TDM-guided dosing (regimen d1) versus days 1 + 2 TDM-guided dosing (regimen d1 + 2) on attaining adequate busulfan exposure. In this observational study, we included all children receiving busulfan-based conditioning for allogeneic hematopoietic cell transplantation. The primary outcome was the percentage of patients achieving busulfan target attainment in each TDM regimen. Secondary outcomes were the variance in busulfan exposure and in day-4 clearance (Clday4) estimates between the two TDM regimens and between dosing days 1 and 2. In regimen d1, 84.3% (n = 91/108) of patients attained a therapeutic busulfan exposure, while in regimen d1 + 2 the proportion was 90.9% (n = 30/33; not significant). The variance of the Clday4 estimates based on day-2 busulfan concentrations was significantly smaller than the variance of those based on day-1 concentrations (p < 0.001). Therefore, day-1-guided (pharmacometric model-based) TDM of busulfan may be sufficient for attaining optimal target exposure, provided that subsequent TDM is carried out if required. However, performing TDM on subsequent days may be beneficial, as measurements on day 2 seemed to reduce the variance in the estimated clearance compared to day-1 sampling.
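
    The core TDM arithmetic behind such a regimen can be illustrated with a deliberately simplified, non-compartmental sketch: estimate clearance from the day-1 dose and measured AUC, then size the remaining doses toward a cumulative exposure target. All numbers (target, dose, sampling times, levels) below are assumptions; the study itself used pharmacometric model-based estimation, not this hand calculation.

```python
# Simplified non-compartmental sketch of day-1 busulfan TDM: clearance
# from dose/AUC, then dose adjustment toward a cumulative exposure target.
# All numbers are illustrative assumptions, not values from the study.
import numpy as np

def auc_trapezoid(times_h: np.ndarray, conc_mg_l: np.ndarray) -> float:
    """AUC (mg*h/L) over the sampled window via the trapezoidal rule."""
    return float(np.sum((conc_mg_l[1:] + conc_mg_l[:-1]) / 2
                        * np.diff(times_h)))

dose_day1_mg = 76.0                              # assumed day-1 dose
times = np.array([2.0, 3.0, 4.0, 6.0])           # h after start of infusion
levels = np.array([8.0, 6.0, 4.5, 2.5])          # mg/L, assumed levels

# In practice the AUC is extrapolated beyond the last sample; here we
# use the sampled window only, so the figures are purely illustrative.
auc_day1 = auc_trapezoid(times, levels)          # ~19 mg*h/L
clearance_l_h = dose_day1_mg / auc_day1          # CL = dose / AUC

target_cum_auc = 90.0                            # mg*h/L over 4 days (assumed)
remaining_auc = target_cum_auc - auc_day1
dose_per_day = clearance_l_h * remaining_auc / 3  # spread over days 2-4
print(f"CL ≈ {clearance_l_h:.1f} L/h, next daily dose ≈ {dose_per_day:.0f} mg")
```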

    Severity of diabetes mellitus and risk of total hip or knee replacement: A population based case-control study

    Background: It is generally thought that people with diabetes mellitus (DM) are more likely to suffer from osteoarthritis (OA) due to an increased Body Mass Index (BMI), resulting in mechanical destruction of cartilage. However, previous studies have shown that DM could also be an independent risk factor for OA, suggesting other causative factors are involved (Nieves-Plaza M, 2013; Schett G, 2013). Objectives: To evaluate the risk of hip or knee replacement, as a proxy for severe OA, in patients with DM compared to non-diabetic patients. We additionally evaluated the risk of total joint replacement (TJR) with various proxies for increased DM severity. Methods: We performed a population-based case-control study using the Clinical Practice Research Datalink (CPRD). Cases (n=94,609) were defined as patients >18 years of age who had undergone TJR between 2000 and 2012. Controls were matched by age, gender and general practice. Conditional logistic regression was used to estimate the risk of total knee (TKR) and total hip replacement (THR) surgery associated with use of antidiabetic drugs (ADs). We additionally stratified current AD users by proxies for DM severity. Results: Current AD use was significantly associated with a lower risk of TKR (OR=0.86 (95% CI=0.78-0.94)) and THR (OR=0.90 (95% CI=0.82-0.99)) compared to patients not using ADs. Moreover, the risk of TKR and THR decreased with increasing HbA1c. Conclusions: In contrast to previous research, this study does not support the hypothesis that diabetic patients are more likely to suffer from severe OA than non-diabetic patients. This is possibly due to methodological and medical dissimilarities between studies.
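
    Matched case-control designs like this one are typically analysed with conditional logistic regression, stratified on the matched sets. A minimal sketch using statsmodels' ConditionalLogit, assuming a hypothetical extract with illustrative column names:

```python
# Conditional logistic regression for the matched case-control design:
# matched sets (age, sex, practice) enter as strata via `groups`.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

cc = pd.read_csv("tjr_case_control.csv")  # hypothetical CPRD-style extract

model = ConditionalLogit(
    cc["is_case"],                          # 1 = TJR case, 0 = matched control
    cc[["current_ad_use", "bmi", "nsaid_use"]],
    groups=cc["matched_set_id"],            # one stratum per matched set
)
res = model.fit()
print(np.exp(res.params))  # odds ratios, cf. OR 0.86 for TKR in the abstract
```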

    Influence of allopurinol on thiopurine associated toxicity: A retrospective population-based cohort study

    Aims: Thiopurines are important for treating inflammatory bowel disease, but are often discontinued due to adverse effects. Concomitant use of allopurinol might lower the risk of these unwanted effects, but large studies in the general population are lacking. The aims of this study were to evaluate rates of hepatotoxicity, myelotoxicity, pancreatic toxicity and therapy persistence in adult thiopurine users with or without allopurinol. Methods: A retrospective population-based cohort study was conducted among current thiopurine users (Clinical Practice Research Datalink). Among these patients, co-use of allopurinol was compared to non-use. Hazard ratios (HRs) for hepatotoxicity, myelotoxicity and pancreatitis were derived using time-dependent Cox proportional hazards models and were adjusted for potential confounders. Persistence of thiopurine use was evaluated using log-rank statistics. Results: Patients using thiopurines (n = 37 360) were identified, of whom 1077 were concomitantly taking allopurinol. A 58% decreased risk of hepatotoxicity was observed in those concomitantly taking allopurinol (HR 0.42; 95% CI 0.30–0.60; NNT 46). The rate of myelotoxicity (HR 0.96; 95% CI 0.89–1.03) was not influenced. The risk of pancreatitis was increased (HR 3.00; 95% CI 1.01–8.93; NNH 337), but only in those with active gout (suggesting confounding by indication). Finally, allopurinol co-users maintained thiopurine therapy over twice as long as those not on allopurinol (3.9 years vs. 1.8 years, P < 0.0001). Conclusion: In thiopurine users, allopurinol is associated with a 58% reduced risk of hepatotoxicity. In addition, thiopurine persistence was prolonged by 2.1 years in allopurinol users. These data support the use of allopurinol in individuals requiring thiopurine therapy.
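
    A time-dependent exposure such as allopurinol co-use is handled by splitting follow-up into (start, stop] intervals so the covariate can switch on and off over time. A minimal sketch with lifelines' CoxTimeVaryingFitter, using assumed file and column names rather than the study's actual code:

```python
# Time-dependent Cox model: allopurinol co-use as a covariate that can
# switch on and off during follow-up, encoded as (start, stop] intervals.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# hypothetical long-format data: one row per patient-interval with columns
# id, start, stop, allopurinol (0/1), age, hepatotoxicity_event
intervals = pd.read_csv("thiopurine_intervals.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(
    intervals,
    id_col="id",
    event_col="hepatotoxicity_event",  # 1 only on the interval where it occurs
    start_col="start",
    stop_col="stop",
)
ctv.print_summary()  # HR for 'allopurinol'; the abstract reports 0.42
```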

    Risk of Lactic Acidosis or Elevated Lactate Concentrations in Metformin Users With Renal Impairment: A Population-Based Cohort Study

    OBJECTIVE The objective of this study was to determine whether treatment with metformin in patients with renal impairment is associated with a higher risk of lactic acidosis or elevated lactate concentrations compared with users of a noninsulin antidiabetic drug (NIAD) who had never used metformin. RESEARCH DESIGN AND METHODS A cohort of 223,968 metformin users and 34,571 diabetic patients who had never used metformin was identified from the Clinical Practice Research Datalink (CPRD). The primary outcome was defined as either a CPRD READ code for lactic acidosis or a record of a plasma lactate concentration >5 mmol/L. The associations between renal impairment, dose of metformin, and the risk of lactic acidosis or elevated lactate concentrations were determined with time-dependent Cox models and expressed as hazard ratios (HRs). RESULTS The crude incidence of lactic acidosis or elevated lactate concentrations in current metformin users was 7.4 per 100,000 person-years (vs. 2.2 per 100,000 person-years in nonusers). Compared with nonusers, the risk of lactic acidosis or elevated lactate concentrations in current metformin users was significantly associated with a renal function <60 mL/min/1.73 m² (adjusted HR 6.37 [95% CI 1.48-27.5]). The increased risk among patients with impaired renal function was further increased in users of ≥730 g of metformin in the preceding year (adjusted HR 11.8 [95% CI 2.27-61.5]) and in users of a recent high daily dose (>2 g) of metformin (adjusted HR 13.0 [95% CI 2.36-72.0]). CONCLUSIONS Our study is consistent with current recommendations that the renal function of metformin users should be adequately monitored and that the dose of metformin should be adjusted, if necessary, if renal function falls below 60 mL/min/1.73 m².
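
    As a worked example of the crude incidence figure quoted above: incidence is simply events divided by person-time at risk, rescaled to 100,000 person-years. The counts below are illustrative back-calculations, not numbers taken from the study's tables.

```python
# Crude incidence = events / person-time, rescaled to 100,000 person-years.
def incidence_per_100k(events: int, person_years: float) -> float:
    return events / person_years * 100_000

# Illustrative back-calculation (not the study's actual counts): about
# 35 events over ~470,000 person-years reproduces the reported 7.4.
print(f"{incidence_per_100k(35, 470_000):.1f} per 100,000 person-years")
```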
