12 research outputs found

    Cardiac Output Monitoring by Pulse Contour Analysis, the Technical Basics of Less-Invasive Techniques

    Routine use of cardiac output (CO) monitoring became available with the introduction of the pulmonary artery catheter into clinical practice. Since then, several systems have been developed that allow for less-invasive CO monitoring. The so-called “non-calibrated pulse contour systems” (PCS) estimate CO based on pulse contour analysis of the arterial waveform, as determined by means of an arterial catheter without additional calibration. Transforming the arterial waveform signal, a pressure measurement, into CO, a volume-per-time parameter, requires precise knowledge of the dynamic characteristics of the arterial vasculature. These characteristics cannot be measured non-invasively and must be estimated. Of the four commercially available systems, three use internal databases or nomograms based on patients’ demographic parameters, and one uses a complex calculation to derive the necessary parameters from small oscillations of the arterial waveform that change with altered arterial dynamic characteristics. The operator must ensure that the arterial waveform is neither over- nor under-damped. A fast-flush test of the catheter–transducer system allows for the evaluation of the dynamic response characteristics of the system and its damping characteristics. Limitations of PCS must be acknowledged, e.g., during intra-aortic balloon-pump therapy or in states of low or high systemic vascular resistance, where accuracy is limited. Nevertheless, it has been shown that a perioperative algorithm-based use of PCS may reduce complications. When the method of operation and the limitations are taken into account, PCS are a helpful component in the armamentarium of the critical care physician.
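    The pressure-to-flow transformation described above can be sketched in code. The following is a deliberately simplified toy model, not any of the commercial algorithms: it approximates stroke volume as the pressure-time area above the diastolic level divided by an assumed aortic impedance, which in practice would come from a nomogram or calibration.

```python
import numpy as np

def estimate_co_pulse_contour(pressure, fs, z_art, heart_rate):
    """Toy pulse-contour CO estimate (illustrative only, not a
    commercial algorithm).

    pressure   : arterial pressure samples for one beat [mmHg]
    fs         : sampling rate [Hz]
    z_art      : assumed aortic impedance [mmHg*s/mL], normally derived
                 from a nomogram or an external calibration
    heart_rate : beats per minute
    """
    diastolic = pressure.min()
    # pressure-time area above the diastolic level, used here as a
    # stand-in for the systolic pressure-time integral
    area = (pressure - diastolic).sum() / fs  # mmHg*s
    stroke_volume = area / z_art              # mL
    return stroke_volume * heart_rate / 1000.0  # L/min
```

    With a synthetic half-sine beat this yields a plausible CO in the range of a few litres per minute; the point is only to show why an estimate of the arterial impedance (here `z_art`, an assumed parameter) is indispensable for turning pressure into flow.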

    Comparison of nasotracheal versus orotracheal intubation for sedation, assisted spontaneous breathing, mobilization, and outcome in critically ill patients: an exploratory retrospective analysis

    Abstract Nasotracheal intubation (NTI) may be used for long-term ventilation in critically ill patients. Although tracheostomy is often favored, NTI may offer potential benefits. Compared to orotracheal intubation (OTI), patients receiving NTI may require less sedation and thus be more alert, with fewer episodes of depressed respiratory drive. We aimed to study the association of NTI versus OTI with sedation, assisted breathing, mobilization, and outcome in an exploratory analysis. Retrospective data on patients intubated in the intensive care unit (ICU) and ventilated for > 48 h were retrieved from electronic records for up to ten days after intubation. Outcome measures were a Richmond Agitation and Sedation Scale (RASS) score of 0 or − 1, sedatives, vasopressors, assisted breathing, mobilization on the ICU mobility scale (ICU-MS), and outcome. From January 2018 to December 2020, 988 patients received OTI and 221 NTI. On days 1–3, a RASS of 0 or − 1 was attained for 4.0 ± 6.1 h/d with OTI versus 9.4 ± 8.4 h/d with NTI, p < 0.001. Propofol, sufentanil, and norepinephrine were required less frequently in NTI, and doses were lower. The NTI group showed a higher proportion of spontaneous breathing from day 1 to 7 (day 1–6: p < 0.001, day 7: p = 0.002). ICU-MS scores were higher in the NTI group (d1–d9: p < 0.001, d10: p = 0.012). OTI was an independent predictor of mortality (odds ratio 1.602, 95% confidence interval 1.132–2.268, p = 0.008). No difference in the rate of tracheostomy was found. NTI was associated with less sedation, more spontaneous breathing, and a higher degree of mobilization during physiotherapy. OTI was identified as an independent predictor of mortality. Given these findings, a new prospective evaluation of NTI versus OTI should be conducted to study risks and benefits in current critical care medicine.
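    The adjusted odds ratio quoted above comes from a logistic regression. The conversion from a regression coefficient to an odds ratio with a Wald 95% confidence interval can be sketched as follows; the beta and standard error in the usage example are back-calculated approximations from the reported interval, not values published by the study.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression
    coefficient (beta) and its standard error (se)."""
    or_point = math.exp(beta)
    ci_low = math.exp(beta - z * se)
    ci_high = math.exp(beta + z * se)
    return or_point, ci_low, ci_high

# approximate back-calculation of the reported OR 1.602 (1.132-2.268):
# beta = ln(1.602) ~ 0.471, se ~ (ln(2.268) - ln(1.132)) / (2 * 1.96)
print(odds_ratio_ci(0.471, 0.177))
```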

    Lung aeration and ventilation after percutaneous tracheotomy measured by electrical impedance tomography in non-hypoxemic critically ill patients: a prospective observational study

    Abstract Background Percutaneous dilatational tracheotomy (PDT) may lead to transient impairment of pulmonary function due to suboptimal ventilation, loss of positive end-expiratory pressure (PEEP) and repetitive suction maneuvers during the procedure. Possible changes in regional lung aeration were investigated using electrical impedance tomography (EIT), an increasingly employed instrument for bedside monitoring of pulmonary aeration. Methods With local ethics committee approval and after obtaining written informed consent, 29 patients scheduled for elective PDT under bronchoscopic control were studied during mechanical ventilation in the supine position. Anesthetized patients were monitored with a 16-electrode EIT monitor for 2 min at four time points: (a) before and (b) after initiation of neuromuscular blockade (NMB), (c) after dilatational tracheostomy (PDT) and (d) after a standardized recruitment maneuver (RM) following surgery. Possible changes in lung aeration were detected by changes in end-expiratory lung impedance (Δ EELI). Global and regional ventilation was characterized by analysis of tidal impedance variation. Results While NMB had no detectable effect on EELI, PDT led to significantly reduced EELI in dorsal lung regions compared to baseline, suggesting reduced regional aeration. This effect could be reversed by a standardized RM. Mean Δ EELI from baseline (SE) was: NMB − 47 ± 62; PDT − 490 ± 180; RM − 89 ± 176, values given in arbitrary units (a.u.). Analysis of regional tidal impedance variation, a robust measure of regional ventilation, did not show significant changes in ventilation distribution. Conclusion Though the changes in EELI might suggest temporary loss of aeration in dorsal lung regions, PDT does not lead to significant changes in either regional ventilation distribution or oxygenation.
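    The two EIT-derived quantities used above, end-expiratory lung impedance and tidal impedance variation, can be extracted from a global impedance time series roughly as below. This is a simplified sketch that cuts breaths at fixed intervals under the assumption of a regular respiratory rate; clinical EIT software uses more robust breath detection.

```python
import numpy as np

def eeli_and_tidal_variation(z, fs, resp_rate):
    """Per-breath end-expiratory lung impedance (EELI) and tidal
    impedance variation (TIV) from a global EIT impedance signal.

    z         : impedance samples (arbitrary units)
    fs        : sampling rate [Hz]
    resp_rate : respiratory rate [1/min]; assumed constant so that
                breaths can be cut at fixed-length windows
    """
    samples_per_breath = int(fs * 60 / resp_rate)
    n_breaths = len(z) // samples_per_breath
    eeli, tiv = [], []
    for k in range(n_breaths):
        breath = z[k * samples_per_breath:(k + 1) * samples_per_breath]
        eeli.append(breath.min())                 # end-expiratory level
        tiv.append(breath.max() - breath.min())   # tidal swing
    return np.array(eeli), np.array(tiv)
```

    A Δ EELI as reported in the study would then be the difference between the mean EELI of a measurement window (e.g., after PDT) and the baseline window, in the same arbitrary units.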

    Endotracheal tube-mounted camera-assisted intubation versus conventional intubation in intensive care: a prospective, randomised trial (VivaITN)

    Abstract Background For critically ill patients, effective airway management with a high first-attempt success rate for endotracheal intubation is essential to prevent hypoxic complications during securing of the airway. Video guidance may improve the first-attempt success rate over direct laryngoscopy (DL). Methods With ethics approval, this randomised controlled trial involved 54 critically ill patients who received endotracheal intubation using a tube with an integrated video camera (VivaSight™-SL tube, VST, ETView Ltd., Misgav, Israel) or conventional intubation under DL. Results The two groups did not differ in terms of intubation conditions. The first-attempt success rate was VST 96% vs. DL 93% (not statistically significant (n. s.)). When intubation at the first attempt failed, it was successful on the second attempt in all patients. There was no difference in the median time to intubation (VST 34 s (interquartile range 28–39) vs. DL 35 s (28–40), n. s.). Neither vomiting, aspiration, nor accidental oesophageal intubation was observed in either group. The lowest pulse-oximetric oxygen saturation for VST was 96 (82–99) % vs. 99 (95–100) % for DL (n. s.). Hypotension, defined as systolic blood pressure < 70 mmHg, occurred in the VST group in 20% vs. the DL group in 15% (n. s.). Conclusion In this pilot study, no advantage was shown for the VST. The VST should be examined further to identify patient groups that could benefit from intubation with the VST, that is, patients with difficult airway conditions. Trial registration ClinicalTrials.gov, NCT02837055. Registered on 13 June 2016.

    Acute-on-chronic liver failure alters linezolid pharmacokinetics in critically ill patients with continuous hemodialysis: an observational study

    Abstract Background In acute-on-chronic liver failure (ACLF), adequate antibiotic dosing is challenging due to changes in drug distribution and elimination. We studied the pharmacokinetics of linezolid in critically ill patients with ACLF during continuous renal replacement therapy compared to patients without concomitant liver failure (NLF). Methods In this prospective cohort study, patients received linezolid 600 mg bid. Linezolid serum samples were analyzed by high-performance liquid chromatography. Population pharmacokinetic modelling was performed, followed by Monte-Carlo simulations of 150 mg bid, 300 mg bid, 450 mg bid, 600 mg bid, and 900 mg bid to assess attainment of a trough concentration target of 2–7 mg/L. Results Eighteen patients were included in this study, nine of whom suffered from ACLF. Linezolid body clearance was lower in the ACLF group, with a mean (standard deviation) of 1.54 (0.52) L/h versus 6.26 (2.43) L/h for NLF, P < 0.001. With the standard dose of 600 mg bid, a trough concentration of 2–7 mg/L was reached in 47% of the NLF group, with 42% underexposed and 11% overexposed, versus 20% of the ACLF group, with 77% overexposed and 3% underexposed. The highest probability of target exposure was attained with 600 mg bid in the NLF group and with 150 mg bid (53%) in the ACLF group. Conclusion Linezolid body clearance in ACLF was markedly lower than in NLF. Given the overall high variability, therapeutic drug monitoring (TDM) with dose adjustments seems required to optimize target attainment. Until TDM results are available, a dose reduction may be considered in ACLF patients to prevent overexposure.
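    The Monte-Carlo step described above can be sketched as follows. This is a strong simplification of the study's population model: a one-compartment IV bolus model with an assumed volume of distribution of 45 L and only clearance varying (drawn from a lognormal matched to the reported group mean and SD); it is meant to illustrate target-attainment simulation, not to reproduce the published results.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_troughs(dose_mg, cl_mean, cl_sd, v_l=45.0, tau_h=12.0, n=10_000):
    """Monte-Carlo steady-state trough concentrations [mg/L] for a
    one-compartment IV bolus model with lognormally distributed
    clearance (parameterized from the arithmetic mean and SD)."""
    cv2 = (cl_sd / cl_mean) ** 2
    sigma = np.sqrt(np.log(1 + cv2))
    mu = np.log(cl_mean) - 0.5 * sigma ** 2
    cl = rng.lognormal(mu, sigma, n)   # clearance [L/h]
    k = cl / v_l                       # elimination rate constant [1/h]
    # steady-state trough by superposition of bolus doses every tau_h
    return (dose_mg / v_l) * np.exp(-k * tau_h) / (1 - np.exp(-k * tau_h))

def target_attainment(troughs, lo=2.0, hi=7.0):
    """Fraction of simulated troughs inside the 2-7 mg/L target."""
    return np.mean((troughs >= lo) & (troughs <= hi))
```

    Plugging in the reported clearances (ACLF 1.54 ± 0.52 L/h, NLF 6.26 ± 2.43 L/h) qualitatively reproduces the pattern in the abstract: with 600 mg bid the simulated ACLF troughs are almost all above 7 mg/L, whereas 150 mg bid brings roughly half of them into the target window.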

    Voriconazole Pharmacokinetics Are Not Altered in Critically Ill Patients with Acute-on-Chronic Liver Failure and Continuous Renal Replacement Therapy: An Observational Study

    Infection and sepsis are a main cause of acute-on-chronic liver failure (ACLF). Besides bacteria, molds also play a role. Voriconazole (VRC) is recommended, but its pharmacokinetics (PK) may be altered by ACLF. Because ACLF patients often suffer from concomitant acute renal failure, we studied the PK of VRC in patients with ACLF receiving continuous renal replacement therapy (RRT) and compared it to the PK of VRC in critically ill patients on RRT without concomitant liver failure (NLF). In this prospective cohort study, patients received weight-based VRC. Pre- and post-dialysis membrane and dialysate samples obtained at different time points were analyzed by high-performance liquid chromatography. An integrated dialysis pharmacometric model was used to model the available PK data. The recommended, 50% lower, and 50% higher doses were analyzed by Monte-Carlo simulation (MCS) for day 1 and at steady state with a target trough concentration (TC) of 0.5–3 mg/L. Fifteen patients were included in this study. Of these, 6 patients suffered from ACLF. A two-compartment model with linear clearance described VRC PK. No difference in central (V1) or peripheral (V2) volumes of distribution or clearance could be demonstrated between the groups. V1 was 80.6 L (95% confidence interval: 62.6–104) and V2 106 L (65–166), with a body clearance of 4.7 L/h (2.87–7.81) and an RRT clearance of 1.46 L/h (1.29–1.64). MCS showed TC below/within/above target of 10/74/16% on day 1 and 9/39/52% at steady state for the recommended dose. A 50% lower dose resulted in 26/72/1% (day 1) and 17/64/19% at steady state, and a 50% higher dose in 7/57/37% and 7/27/67%, respectively. VRC pharmacokinetics are not significantly influenced by ACLF in critically ill patients who receive RRT. The maintenance dose should be adjusted in both groups. Due to the high interindividual variability, therapeutic drug monitoring seems inevitable.
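    A two-compartment model with linear clearance, as fitted above, can be simulated numerically as below. The volumes and clearances are the point estimates from the abstract, but the inter-compartmental clearance `q` is an assumed placeholder (it is not reported), dosing is simplified to IV boluses, and simple Euler integration is used; the sketch illustrates the model structure, not the study's pharmacometric analysis.

```python
def two_cmt_trough(dose_mg, tau_h, n_doses, v1=80.6, v2=106.0,
                   cl_body=4.7, cl_rrt=1.46, q=10.0, dt=0.01):
    """Trough concentration [mg/L] after n_doses of an IV bolus every
    tau_h hours in a two-compartment model with linear elimination.

    v1, v2        : central / peripheral volumes [L] (reported estimates)
    cl_body       : body clearance [L/h] (reported estimate)
    cl_rrt        : RRT clearance [L/h] (reported estimate)
    q             : inter-compartmental clearance [L/h] (ASSUMED value)
    """
    cl_total = cl_body + cl_rrt      # total elimination from V1 [L/h]
    a1 = a2 = 0.0                    # drug amounts [mg]
    steps_per_tau = int(tau_h / dt)
    for _ in range(n_doses):
        a1 += dose_mg                # bolus into the central compartment
        for _ in range(steps_per_tau):
            c1, c2 = a1 / v1, a2 / v2
            da1 = (-cl_total * c1 - q * (c1 - c2)) * dt  # Euler step
            da2 = (q * (c1 - c2)) * dt
            a1 += da1
            a2 += da2
    return a1 / v1                   # trough just before the next dose
```

    Running this for repeated doses shows the accumulation behavior that makes the day-1 and steady-state target-attainment figures differ: the trough after many doses is higher than after the first dose.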

    Retrospective analysis of central venous catheters in elective intracranial surgery - Is there any benefit?

    Background It remains unclear whether the use of central venous catheters (CVC) improves a patient's clinical outcome after elective intracranial supratentorial procedures. Methods This two-armed, single-center retrospective study sought to compare patients undergoing elective intracranial surgery with and without CVCs. Standard anaesthesia procedures were modified during the study period, resulting in the termination of obligatory CVC instrumentation for supratentorial procedures. Peri-operative adverse events (AEs) were evaluated as the primary endpoint. Results The data of 621 patients in total were analysed in this study (301 with and 320 without CVC). Patient characteristics and surgical procedures were comparable between both study groups. A total of 132 peri-operative AEs (81 in the group with CVC vs. 51 in the group without CVC) regarding neurological, neurosurgical and cardiovascular events and death were observed. CVC patients suffered from AEs almost twice as often as non-CVC patients (ORadjusted = 1.98; 95%CI[1.28-3.06]; p = 0.002). Complications related to catheter placement (pneumothorax and arterial malpuncture) were observed in 1.0% of the cases. The ICU treatment period in patients with CVC was 22 (19;24) vs. 21 (19;24) hours (p = 0.413). The duration of hospital stay was also similar between groups (9 (7;13) vs. 8 (7;11) days, p = 0.210). The total time of ventilation (350 (300;440) vs. 335 (281;405) min, p = 0.003) and the induction time (40 (35;50) vs. 30 (25;35) min, p …) were longer in the group with CVC. Conclusion The data of our retrospective study suggests that patients undergoing elective neurosurgical procedures with CVCs do not demonstrate any additional benefits in comparison to patients without a CVC.