THERAPEUTIC DRUG MONITORING OF BETA-LACTAM ANTIBIOTICS IN CRITICALLY ILL PATIENTS WITH SEPSIS
Sepsis is a devastating diagnosis affecting over 750,000 patients a year, accounting for approximately 10% of all hospital admissions and more than $17 billion in annual spending. The mortality rate for sepsis remains unacceptably high: one out of every three patients diagnosed with sepsis dies. Sepsis induces physiologic changes in drug pharmacokinetic (PK) parameters that impair the ability to achieve the beta-lactam pharmacodynamic (PD) target of an unbound concentration greater than 4-fold above the minimum inhibitory concentration for 100% of the dosing interval (100% fT >4x MIC). Sepsis treatments such as volume resuscitation and vasopressor agents increase cardiac output and circulating blood flow, resulting in increased glomerular filtration and enhanced elimination of antibiotics. The PK alterations observed in critically ill septic patients are strongly associated with sub-optimal beta-lactam concentrations. Sub-optimal beta-lactam dosing has resulted in higher rates of therapeutic failure and increased mortality in critically ill patients with sepsis. In addition to the risk of under-exposure, growing data suggest certain beta-lactam combinations are associated with increased nephrotoxicity. Therapeutic drug monitoring of beta-lactam antibiotics is a strategy to improve the outcomes of critically ill septic patients by maximizing efficacy and minimizing toxicity
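The target attainment check described above reduces to comparing every sampled unbound concentration against 4 times the MIC over the dosing interval. A minimal sketch, using hypothetical concentrations and MIC not drawn from the abstract:

```python
def meets_ft_target(unbound_concs_mg_L, mic_mg_L, multiplier=4.0):
    """Return True if every sampled unbound concentration exceeds
    multiplier x MIC, i.e. the 100% fT > 4x MIC target is attained."""
    threshold = multiplier * mic_mg_L
    return all(c > threshold for c in unbound_concs_mg_L)

# Illustrative values only: hourly unbound beta-lactam levels over one
# dosing interval, against an assumed MIC of 4 mg/L (threshold 16 mg/L).
levels = [64.0, 48.0, 35.0, 27.0, 21.0, 18.0]
print(meets_ft_target(levels, mic_mg_L=4.0))  # every sample > 16 mg/L -> True
```

In practice the fraction of the interval above target would be interpolated from a fitted concentration-time curve rather than discrete samples; this sketch only shows the threshold logic.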
Use of Continuous Renal Replacement Therapy for Removal of Dabigatran in a Patient in Need of Emergent Surgery
Purpose. To report the ability to remove serum dabigatran using continuous renal replacement therapy (CRRT) in a patient with life-threatening bleeding. Summary. A 77-year-old female with a history of atrial fibrillation, taking dabigatran for stroke prevention, presented with abdominal pain. The patient was found to have bleeding and possible mesenteric ischemia, was taken to the operating room, and had continued bleeding postoperatively. CRRT was initiated for the removal of any remaining dabigatran, with serum dabigatran levels collected to evaluate removal of the drug with CRRT. The patient had an elevated dabigatran level prior to intervention, which decreased to an undetectable level after use of CRRT. Greater than 80% of the drug was removed after 4 hours of CRRT combined with residual kidney function. Reversal of dabigatran is an area of active research, with the recent FDA approval of idarucizumab. Conclusion. Bleeding may occur as a result of the use of dabigatran and changes in a patient's clinical condition. Use of CRRT may be an option for removing serum dabigatran in the case of a life-threatening bleed
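The ">80% removed" figure follows from simple pre/post arithmetic on serum levels. A small sketch with hypothetical levels (the abstract reports no numeric concentrations):

```python
def percent_removed(pre_level_ng_mL, post_level_ng_mL):
    """Percent drug removal between a pre-CRRT and a post-CRRT serum level."""
    if pre_level_ng_mL <= 0:
        raise ValueError("pre-CRRT level must be positive")
    return 100.0 * (pre_level_ng_mL - post_level_ng_mL) / pre_level_ng_mL

# Hypothetical dabigatran levels chosen only to mirror the >80% removal
# described in the case report.
print(percent_removed(480.0, 60.0))  # 87.5
```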
Pharmacotherapy in Coronavirus Disease 2019 and Risk of Secondary Infections: A Single-Center Case Series and Narrative Review
OBJECTIVES: Since the onset of the coronavirus disease 2019 pandemic, immune modulators have been considered front-line candidates for the management of patients presenting with clinical symptoms secondary to severe acute respiratory syndrome coronavirus 2 infection. Although heavy emphasis has been placed on early clinical efficacy, we sought to evaluate the impact of pharmacologic approach to coronavirus disease 2019 within the ICU on secondary infections and clinical outcomes.
DATA SOURCES: PubMed (inception to March 2021) database search and manual selection of bibliographies from selected articles.
STUDY SELECTION AND DATA EXTRACTION: Articles relevant to coronavirus disease 2019, management of severe acute respiratory syndrome coronavirus 2-associated respiratory failure, and prevalence of secondary infections with pharmacotherapies were selected. The MeSH terms COVID-19, secondary infection, SARS-CoV-2, tocilizumab, and corticosteroids were used for article identification. Articles were narratively synthesized for this review.
DATA SYNTHESIS: Current data surrounding the use of tocilizumab and/or corticosteroids for coronavirus disease 2019 management are limited given the short follow-up period and conflicting results between studies. Further complicating the understanding of immune modulator role is the lack of definitive understanding of clinical impact of the immune response in coronavirus disease 2019.
CONCLUSIONS: Based on the current available literature, we suggest prolonged trials and follow-up intervals for those patients managed with immune modulating agents for the management of coronavirus disease 2019
Acute Skeletal Muscle Wasting and Dysfunction Predict Physical Disability at Hospital Discharge in Patients with Critical Illness
BACKGROUND: Patients surviving critical illness develop muscle weakness and impairments in physical function; however, the relationship between early skeletal muscle alterations and physical function at hospital discharge remains unclear. The primary purpose of this study was to determine whether changes in muscle size, strength and power assessed in the intensive care unit (ICU) predict physical function at hospital discharge.
METHODS: Study design is a single-center, prospective, observational study in patients admitted to the medicine or cardiothoracic ICU with diagnosis of sepsis or acute respiratory failure. Rectus femoris (RF) and tibialis anterior (TA) muscle ultrasound images were obtained day one of ICU admission, repeated serially and assessed for muscle cross-sectional area (CSA), layer thickness (mT) and echointensity (EI). Muscle strength, as measured by Medical Research Council-sum score, and muscle power (lower-extremity leg press) were assessed prior to ICU discharge. Physical function was assessed with performance on 5-times sit-to-stand (5STS) at hospital discharge.
RESULTS: Forty-one patients with a median age of 61 years (IQR 55-68), 56% male, and a sequential organ failure assessment score of 8.1 ± 4.8 were enrolled. RF muscle CSA decreased significantly, with a median percent change of 18.5% from day 1 to day 7 (F = 26.6, p = 0.0253). RF EI increased at a mean percent change of 10.5 ± 21% in the first 7 days (F = 3.28, p = 0.081). At hospital discharge, 25.7% of patients (9/35) met criteria for ICU-acquired weakness (ICU-AW). Change in RF EI in the first 7 days of ICU admission and muscle power measured prior to ICU discharge were strong predictors of ICU-AW at hospital discharge (AUC = 0.912). Muscle power at ICU discharge, age and ICU length of stay were predictive of performance on 5STS at hospital discharge.
CONCLUSION: ICU-assessed muscle alterations, specifically RF EI and muscle power, are predictors of a diagnosis of ICU-AW and of physical function assessed by 5STS at hospital discharge in patients surviving critical illness
Association of Phosphate-Containing versus Phosphate-Free Solutions on Ventilator Days in Patients Requiring Continuous Kidney Replacement Therapy
Background and objectives Hypophosphatemia is commonly observed in patients receiving continuous KRT. Patients who develop hypophosphatemia may be at risk of respiratory and neuromuscular dysfunction and therefore subject to prolongation of ventilator support. We evaluated the association of phosphate-containing versus phosphate-free continuous KRT solutions with ventilator dependence in critically ill patients receiving continuous KRT.
Design, setting, participants, & measurements Our study was a single-center, retrospective, pre-post cohort study of adult patients receiving continuous KRT and mechanical ventilation during their intensive care unit stay. Zero-inflated negative binomial regression with and without propensity score matching was used to model our primary outcome: ventilator-free days at 28 days. Intensive care unit and hospital lengths of stay as well as hospital mortality were analyzed with a t test or a chi-squared test, as appropriate.
Results We identified 992 eligible patients, of whom 649 (65%) received phosphate-containing solutions and 343 (35%) received phosphate-free solutions. In multivariable models, patients receiving phosphate-containing continuous KRT solutions had 12% (95% confidence interval, 0.17 to 0.47) more ventilator-free days at 28 days. Patients exposed to phosphate-containing versus phosphate-free solutions had 17% (95% confidence interval, −0.08 to −0.30) fewer days in the intensive care unit and 20% (95% confidence interval, −0.12 to −0.32) fewer days in the hospital. Concordant results were observed for ventilator-free days at 28 days in the propensity score matched analysis. There was no difference in hospital mortality between the groups.
Conclusions The use of phosphate-containing versus phosphate-free continuous KRT solutions was independently associated with fewer ventilator days and shorter stay in the intensive care unit
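The outcome modeled above, ventilator-free days (VFDs) at 28 days, is a count with excess zeros (patients never liberated from the ventilator contribute structural zeros), which motivates the zero-inflated negative binomial choice. A minimal simulation sketch of such data; the cohort sizes come from the abstract, but the means, dispersion, and zero probability are assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_vfd(n, mean_days, zero_prob, rng):
    """Draw ventilator-free days at 28 days from a zero-inflated negative
    binomial: structural zeros represent patients never liberated from
    the ventilator, and counts are capped at 28."""
    structural_zero = rng.random(n) < zero_prob
    r = 3.0  # NB dispersion (size); NumPy's success prob p = r / (r + mean)
    counts = rng.negative_binomial(r, r / (r + mean_days), size=n)
    return np.where(structural_zero, 0, np.minimum(counts, 28))

# Cohort sizes from the abstract; all other parameters are assumptions.
phosphate = simulate_vfd(649, mean_days=15.7, zero_prob=0.25, rng=rng)
phosphate_free = simulate_vfd(343, mean_days=14.0, zero_prob=0.25, rng=rng)
print(phosphate.mean(), phosphate_free.mean())
```

Fitting such data with a plain Poisson or normal model would mishandle the zero spike; the zero-inflated formulation models the "never extubated" process separately from the count process.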
The Dynamics of Baroclinic Zonal Jets
Multiple alternating zonal jets are a ubiquitous feature of planetary atmospheres and oceans. However, most studies to date have focused on the special case of barotropic jets. Here, the dynamics of freely evolving baroclinic jets are investigated using a two-layer quasigeostrophic annulus model with sloping topography. In a suite of 15 numerical simulations, the baroclinic Rossby radius and baroclinic Rhines scale are sampled by varying the stratification and root-mean-square eddy velocity, respectively. Small-scale eddies in the initial state evolve through geostrophic turbulence and accelerate zonally as they grow in horizontal scale, first isotropically and then anisotropically. This process leads ultimately to the formation of jets, which take about 2500 rotation periods to equilibrate. The kinetic energy spectrum of the equilibrated baroclinic zonal flow steepens from a −3 power law at small scales to a −5 power law near the jet scale. The conditions most favorable for producing multiple alternating baroclinic jets are large baroclinic Rossby radius (i.e., strong stratification) and small baroclinic Rhines scale (i.e., weak root-mean-square eddy velocity). The baroclinic jet width is diagnosed objectively and found to be 2.2–2.8 times larger than the baroclinic Rhines scale, with a best estimate of 2.5 times larger. This finding suggests that Rossby wave motions must be moving at speeds of approximately 6 times the turbulent eddy velocity in order to be capable of arresting the isotropic inverse energy cascade
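The diagnosed jet width of roughly 2.5 times the baroclinic Rhines scale is a direct arithmetic relation. A sketch under one common convention for the Rhines scale, with illustrative mid-latitude values that are not taken from the study:

```python
import math

def rhines_scale(u_rms, beta):
    """Rhines scale L_R = sqrt(2 U / beta), one common convention
    (u_rms in m/s, beta in 1/(m s))."""
    return math.sqrt(2.0 * u_rms / beta)

# Illustrative values: 5 cm/s rms eddy velocity, mid-latitude beta.
u_rms = 0.05
beta = 1.6e-11
L_R = rhines_scale(u_rms, beta)
jet_width = 2.5 * L_R  # the study's best-estimate multiple for baroclinic jets
print(round(L_R / 1e3), round(jet_width / 1e3))  # -> 79 198 (km)
```

The study samples the Rhines scale by varying the rms eddy velocity, consistent with the square-root dependence above: halving U shrinks L_R by a factor of sqrt(2), favoring more, narrower jets.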
Cognitive reserve in granulin-related frontotemporal dementia: from preclinical to clinical stages
OBJECTIVE
Consistent with the cognitive reserve hypothesis, higher educational and occupational attainment may help persons with neurodegenerative dementias to better withstand neuropathology before developing cognitive impairment. Here we tested the cognitive reserve hypothesis in patients with frontotemporal dementia (FTD), with or without pathogenic granulin mutations (GRN+ and GRN-), and in presymptomatic GRN mutation carriers (aGRN+).
METHODS
Educational and occupational attainment were assessed and combined to define a Reserve Index (RI) in 32 FTD patients (12 GRN+ and 20 GRN-) and in 17 aGRN+. Changes in functional connectivity were estimated by resting-state fMRI, focusing on the salience network (SN), executive network (EN) and bilateral frontoparietal networks (FPNs). Cognitive status was measured by the FTD-modified Clinical Dementia Rating Scale.
RESULTS
In FTD patients, a higher level of premorbid cognitive reserve was associated with reduced connectivity within the SN and the EN. The EN was more involved in FTD patients without GRN mutations, while the SN was more affected in GRN pathology. In aGRN+, cognitive reserve was associated with reduced SN connectivity.
CONCLUSIONS
This study suggests that cognitive reserve modulates functional connectivity in patients with FTD, even in monogenic disease. In GRN-inherited FTD, cognitive reserve mechanisms operate from presymptomatic to clinical stages
Development, Implementation and Outcomes of a Quality Assurance System for the Provision of Continuous Renal Replacement Therapy in the Intensive Care Unit
Critically ill patients requiring continuous renal replacement therapy (CRRT) represent a growing intensive care unit (ICU) population. Optimal CRRT delivery demands continuous communication between stakeholders, iterative adjustment of therapy, and quality assurance systems. This Quality Improvement (QI) study reports the development, implementation and outcomes of a quality assurance system to support the provision of CRRT in the ICU. This study was carried out at the University of Kentucky Medical Center between September 2016 and June 2019. We implemented a quality assurance system using a stepwise approach based on the (a) assembly of a multidisciplinary team, (b) standardization of the CRRT protocol, (c) creation of electronic CRRT flowsheets, (d) selection, monitoring and reporting of quality metrics of CRRT deliverables, and (e) enhancement of education. We examined 34 months of data comprising 1185 adult patients on CRRT (~7420 patient-days of CRRT) and tracked selected QI outcomes/metrics of CRRT delivery. As a result of the QI interventions, we increased the number of multidisciplinary experts in the CRRT team and ensured a continuum of education for health care professionals. We increased the use of continuous veno-venous hemodiafiltration to 100% and doubled the percentage of patients receiving regional citrate anticoagulation. The delivered CRRT effluent dose (~30 ml/kg/h) and the delivered/prescribed effluent dose ratio (~0.89) remained stable within the study period. The average filter life increased from 26 to 31 h (p = 0.020), reducing the mean utilization of filters per patient from 3.56 to 2.67 (p = 0.054) despite similar CRRT duration and mortality rates. The number of CRRT access alarms per treatment day was reduced by 43%. The improvement in filter utilization translated into ~20,000 USD gross savings in filter cost per 100 patients receiving CRRT.
We satisfactorily developed and implemented a quality assurance system for the provision of CRRT in the ICU that enabled sustainable tracking of CRRT deliverables and reduced filter resource utilization at our institution
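The reported ~20,000 USD gross savings per 100 patients follows directly from the drop in mean filter utilization. A sketch of the arithmetic; the per-filter unit cost is an assumed figure chosen to reproduce the stated savings, not a value from the study:

```python
def filter_savings_per_100_patients(filters_before, filters_after, unit_cost_usd):
    """Gross filter-cost savings per 100 CRRT patients from reduced
    mean filter utilization per patient."""
    return (filters_before - filters_after) * 100 * unit_cost_usd

# Utilization figures from the study (3.56 -> 2.67 filters/patient);
# a hypothetical unit cost of $225/filter yields roughly the reported savings.
print(round(filter_savings_per_100_patients(3.56, 2.67, 225.0)))  # -> 20025
```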