
    Monitoring drug therapy in hospitalized patients

    Prevention of adverse drug events that may result from medication errors is challenging. The safety of medication treatment is mostly established in an average population, and medication errors may be prevented when pharmacotherapy is better tailored to the individual needs of hospitalized patients. The prescribing and monitoring stages of the medication management process in particular have been shown to be prone to errors. This thesis therefore describes the frequency and potential clinical relevance of drug therapy monitoring with laboratory markers in hospitalized patients, with a special focus on potential drug-drug interactions (pDDIs), potassium and ICU patients. The frequency, nature and determinants of pDDIs were determined in both the general hospital population and intensive care unit (ICU) patients. pDDIs occurred in 54% of ICU patients and in 25% of patients hospitalized at general departments. In both settings, the top-10 drug-drug pairs were responsible for the majority of the pDDI alerts: 53% of the alerts at general departments and 79% at the ICU. The most frequently occurring possible outcome was an increased risk of side effects, and the most frequently advised risk mitigation strategy was laboratory monitoring. Three laboratory markers often involved in the clinical risk management of drug therapy (serum potassium, sodium and creatinine) were measured in approximately 50% of hospitalized patients. Patient-related factors appeared to be stronger predictors for monitoring than the use of specific medications; only when medication with an intended effect on electrolytes was used were measurements performed more frequently. The percentage of patients with a measurement within the last 48 hours before discharge was less than 25%, suggesting room for improvement in the communication of relevant laboratory values at transition of care. Compared to patients using only one serum potassium increasing drug, serum potassium levels were measured slightly more frequently in patients using two or more serum potassium increasing drugs concomitantly (67% vs 58%). Although prescribers received a direct pop-up to monitor serum potassium levels, serum potassium was not measured in 33% of these patients and 10% developed hyperkalemia. When patients were using both a potassium increasing drug (PID) and a potassium decreasing drug (PDD) and the PDD was stopped, serum potassium levels increased in 59% of patients and 3.2% developed hyperkalemia (potassium > 5.5 mmol/L). When the PID was stopped, serum potassium levels decreased in 70% of patients and 17% developed hypokalemia (potassium < 3.5 mmol/L). Insulin used in tight glucose control (TGC) protocols on the ICU may also influence serum potassium levels, but TGC was not associated with an increased risk of hypokalemia. High and low mean serum glucose and potassium levels, as well as high variability, were however associated with increased ICU mortality. All studies show the importance of laboratory monitoring as a risk mitigation strategy in drug therapy. Patient safety may therefore be optimized when laboratory markers are included in the risk mitigation strategies of clinical decision support systems for medication.
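
    As an illustration of how such a clinical decision support rule might use these laboratory markers, the sketch below (Python) flags patients who use two or more potassium increasing drugs without a recent serum potassium measurement and classifies measured values against the thresholds mentioned above (hypokalemia < 3.5 mmol/L, hyperkalemia > 5.5 mmol/L). The Patient structure, the example drug list and the 48-hour monitoring window are illustrative assumptions, not part of the thesis.

        from dataclasses import dataclass
        from datetime import datetime, timedelta
        from typing import List, Optional

        # Thresholds taken from the abstract; drug names and window are illustrative only.
        HYPOKALEMIA = 3.5   # mmol/L
        HYPERKALEMIA = 5.5  # mmol/L
        POTASSIUM_INCREASING = {"spironolactone", "enalapril", "losartan", "trimethoprim"}

        @dataclass
        class Patient:
            drugs: List[str]
            last_potassium: Optional[float]          # mmol/L, None if never measured
            last_potassium_time: Optional[datetime]  # time of the last measurement

        def potassium_alert(p: Patient, now: datetime, window_hours: int = 48) -> Optional[str]:
            """Return an alert text when monitoring or follow-up seems warranted, else None."""
            n_pids = sum(1 for d in p.drugs if d.lower() in POTASSIUM_INCREASING)
            recent = (p.last_potassium_time is not None
                      and now - p.last_potassium_time <= timedelta(hours=window_hours))
            if n_pids >= 2 and not recent:
                return "two or more potassium increasing drugs without a recent potassium measurement"
            if p.last_potassium is not None and p.last_potassium > HYPERKALEMIA:
                return f"hyperkalemia: {p.last_potassium} mmol/L"
            if p.last_potassium is not None and p.last_potassium < HYPOKALEMIA:
                return f"hypokalemia: {p.last_potassium} mmol/L"
            return None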

    Lithium lacks effect on survival in amyotrophic lateral sclerosis: a phase IIb randomised sequential trial.

    OBJECTIVES: To determine the safety and efficacy of lithium for the treatment of amyotrophic lateral sclerosis (ALS) in a randomised, placebo controlled, double blind, sequential trial. METHODS: Between November 2008 and June 2011, 133 patients were randomised to receive lithium carbonate (target blood level 0.4-0.8 mEq/l) or placebo as add-on treatment to riluzole. The primary endpoint was survival, defined as death, tracheostomal ventilation or non-invasive ventilation for more than 16 h/day. Secondary outcome measures consisted of the revised ALS Functional Rating Scale and forced vital capacity. Analysis was by intention to treat and according to a sequential trial design. RESULTS: 61 patients reached a primary endpoint, 33 of 66 in the lithium group and 28 of 67 patients in the placebo group. Lithium did not significantly affect survival (cumulative survival probability of 0.73 in the lithium group (95% CI 0.63 to 0.86) vs 0.75 in the placebo group (95% CI 0.65 to 0.87) at 12 months, and 0.62 in the lithium group (95% CI 0.50 to 0.76) vs 0.67 in the placebo group (95% CI 0.56 to 0.81) at 16 months). Secondary outcome measures did not differ between treatment groups. No major safety concerns were encountered. CONCLUSIONS: This trial, designed to detect a modest effect of lithium, did not demonstrate any beneficial effect on either survival or functional decline in patients with ALS. TRIAL REGISTRATION NUMBER: NTR1448. Name of trial registry: Lithium trial in ALS.
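
    The cumulative survival probabilities quoted above (0.73 vs 0.75 at 12 months, 0.62 vs 0.67 at 16 months) are the kind of quantities a Kaplan-Meier estimator yields from time-to-event data with censoring. The sketch below is a minimal, self-contained estimator run on made-up toy data; it is not the trial's analysis code, and the sequential design and confidence intervals are not reproduced.

        def kaplan_meier(times, events, t_query):
            """Kaplan-Meier estimate of the probability of remaining endpoint-free past t_query.

            times  : follow-up in months until the endpoint or censoring
            events : 1 if the composite endpoint (death, tracheostomal ventilation or
                     NIV > 16 h/day) was reached, 0 if the patient was censored
            """
            survival = 1.0
            for t in sorted({t for t, e in zip(times, events) if e == 1}):
                if t > t_query:
                    break
                at_risk = sum(1 for ti in times if ti >= t)                    # still under observation at t
                failed = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
                survival *= 1.0 - failed / at_risk                             # conditional survival at t
            return survival

        # Toy data only, not trial data: follow-up in months and endpoint indicator per patient.
        toy_times = [3, 5, 8, 12, 12, 14, 16, 16, 18, 20]
        toy_events = [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]
        print(kaplan_meier(toy_times, toy_events, 12))   # estimated survival at 12 months
        print(kaplan_meier(toy_times, toy_events, 16))   # estimated survival at 16 months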

    Decontamination of the digestive tract and oropharynx in ICU patients.

    BACKGROUND: Selective digestive tract decontamination (SDD) and selective oropharyngeal decontamination (SOD) are infection-prevention measures used in the treatment of some patients in intensive care, but reported effects on patient outcome are conflicting. METHODS: We evaluated the effectiveness of SDD and SOD in a crossover study using cluster randomization in 13 intensive care units (ICUs), all in The Netherlands. Patients with an expected duration of intubation of more than 48 hours or an expected ICU stay of more than 72 hours were eligible. In each ICU, three regimens (SDD, SOD, and standard care) were applied in random order over the course of 6 months. Mortality at day 28 was the primary end point. SDD consisted of 4 days of intravenous cefotaxime and topical application of tobramycin, colistin, and amphotericin B in the oropharynx and stomach. SOD consisted of oropharyngeal application only of the same antibiotics. Monthly point-prevalence studies were performed to analyze antibiotic resistance. RESULTS: A total of 5939 patients were enrolled in the study, with 1990 assigned to standard care, 1904 to SOD, and 2045 to SDD; crude mortality in the groups at day 28 was 27.5%, 26.6%, and 26.9%, respectively. In a random-effects logistic-regression model with age, sex, Acute Physiology and Chronic Health Evaluation (APACHE II) score, intubation status, and medical specialty used as covariates, odds ratios for death at day 28 in the SOD and SDD groups, as compared with the standard-care group, were 0.86 (95% confidence interval [CI], 0.74 to 0.99) and 0.83 (95% CI, 0.72 to 0.97), respectively. CONCLUSIONS: In an ICU population in which the mortality rate associated with standard care was 27.5% at day 28, the rate was reduced by an estimated 3.5 percentage points with SDD and by 2.9 percentage points with SOD. (Controlled Clinical Trials number, ISRCTN35176830.)
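
    The reported absolute reductions follow from applying the adjusted odds ratios to the standard-care mortality of 27.5%: converting that baseline risk to odds, scaling by the odds ratio and converting back gives roughly 24-25% adjusted mortality, i.e. a reduction of about 2.9 (SOD) and 3.5 (SDD) percentage points, give or take rounding of the published odds ratios. A minimal sketch of that arithmetic:

        def risk_from_odds_ratio(baseline_risk, odds_ratio):
            """Apply an odds ratio to a baseline risk and return the corresponding risk."""
            baseline_odds = baseline_risk / (1.0 - baseline_risk)
            new_odds = baseline_odds * odds_ratio
            return new_odds / (1.0 + new_odds)

        baseline = 0.275  # day-28 mortality with standard care
        for label, odds_ratio in [("SOD", 0.86), ("SDD", 0.83)]:
            adjusted = risk_from_odds_ratio(baseline, odds_ratio)
            reduction_pp = 100.0 * (baseline - adjusted)
            print(f"{label}: adjusted mortality {adjusted:.1%}, reduction {reduction_pp:.1f} percentage points")

    With the rounded odds ratios this prints reductions of about 2.9 and 3.6 percentage points, matching the paper's 2.9 and 3.5 within rounding.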