
    Oxygenation during general anesthesia in pediatric patients: A retrospective observational study

    Study objective: Protocols are used in intensive care and emergency settings to limit the use of oxygen. However, in pediatric anesthesiology, such protocols do not exist. This study aimed to investigate the administration of oxygen during pediatric general anesthesia and to relate these values to PaO2, SpO2 and SaO2. Design: Retrospective observational study. Setting: Tertiary pediatric academic hospital, from June 2017 to August 2020. Patients: Patients aged 0–18 years who underwent general anesthesia for a diagnostic or surgical procedure with tracheal intubation and an arterial catheter for regular blood withdrawal were included. Patients on cardiopulmonary bypass or those with missing data were excluded. Electronic charts were reviewed for patient characteristics, type of surgery, arterial blood gas analyses, and oxygenation management. Interventions: None. Measurements: The primary outcomes, FiO2, PaO2 and SpO2, were summarized using descriptive analyses, and the correlation between PaO2 and FiO2 was determined using the weighted Spearman correlation coefficient. Main results: Data of 493 cases were obtained. Of these, 267 were excluded for various reasons. Finally, 226 cases with a total of 645 samples were analyzed. The median FiO2 was 36% (IQR 31 to 43%), with a range from 20% to 97%, and the median PaO2 was 23.6 kPa (IQR 18.6 to 28.1), equivalent to 177 mmHg (IQR 140 to 211). The median SpO2 was 99% (IQR 98 to 100%). The study showed a moderately positive association between PaO2 and FiO2 (r = 0.52, p < 0.001). In 574 of 645 samples (89%), PaO2 exceeded 13.3 kPa (100 mmHg). Conclusions: Oxygen administration during general pediatric anesthesia is barely regulated. Hyperoxemia is observed intraoperatively in approximately 90% of cases. Future research should focus on outcomes related to hyperoxemia.
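    The abstract reports each PaO2 in both kPa and mmHg and counts values above 13.3 kPa (100 mmHg) as hyperoxemia. A minimal Python sketch of that unit conversion and cutoff; the variable names and sample numbers below are illustrative assumptions, not taken from the study's analysis code:

    ```python
    # Illustrative only: the unit conversion and hyperoxemia cutoff behind the
    # abstract's paired kPa/mmHg values. Names and numbers are assumptions.
    from scipy.stats import spearmanr

    KPA_TO_MMHG = 7.50062  # 1 kPa = 7.50062 mmHg

    def is_hyperoxemic(pao2_kpa: float, threshold_kpa: float = 13.3) -> bool:
        """PaO2 above 13.3 kPa (~100 mmHg) is counted as hyperoxemia here."""
        return pao2_kpa > threshold_kpa

    pao2_kpa = [23.6, 12.1, 18.6]   # made-up sample values
    fio2 = [0.36, 0.21, 0.31]       # made-up paired FiO2 fractions
    print([round(p * KPA_TO_MMHG) for p in pao2_kpa])  # [177, 91, 140]
    print([is_hyperoxemic(p) for p in pao2_kpa])       # [True, False, True]

    # The study used a weighted Spearman correlation (accounting for repeated
    # samples per case); scipy's unweighted spearmanr is shown as a stand-in.
    rho, p_value = spearmanr(fio2, pao2_kpa)
    ```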

    Compliance of general practitioners with a guideline-based decision support system for ordering blood tests

    BACKGROUND: Guidelines are viewed as a mechanism for disseminating a rapidly increasing body of knowledge. We determined the compliance of Dutch general practitioners with the recommendations for blood test ordering as defined in the guidelines of the Dutch College of General Practitioners. METHODS: We performed an audit of guideline compliance over a 12-month period (March 1996 through February 1997). In an observational study, a guideline-based decision support system for blood test ordering, BloodLink, was integrated with the electronic patient records of 31 general practitioners working in 23 practices (16 solo). BloodLink followed the guidelines of the Dutch College of General Practitioners. We determined compliance by comparing the recommendations for test ordering with the test(s) actually ordered. Compliance was expressed as the percentage of order forms that followed the recommendations for test ordering. RESULTS: Of 12 668 orders generated, 9091 (71%) used the decision-support software rather than the paper order forms. Twelve indications accounted for >80% of the 7346 order forms for which a testing indication was selected in BloodLink. The most frequently used indication for test ordering was "vague complaints" (2209 order forms; 30.1%). Of the 7346 order forms, 39% were compliant. The most frequent type of noncompliance was the addition of tests. Six of the 12 tests most frequently added to the order forms were supported by revisions of guidelines that occurred within 3 years after the intervention period. CONCLUSIONS: In general practice, noncompliance with guidelines is predominantly caused by adding tests. We conclude that noncompliance with a guideline seems to be partly caused by practitioners applying new medical insight before it is incorporated into a revision of that guideline.
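    Compliance here is a per-order-form comparison of the tests actually ordered against those the guideline recommends for the chosen indication. A hedged sketch of that comparison logic (hypothetical test names, not BloodLink's implementation), which also separates out the "tests added" pattern the audit found dominant:

    ```python
    # Hypothetical illustration of the audit's per-form compliance check;
    # not BloodLink's actual code. Test names are invented.
    def classify_order(ordered: set, recommended: set) -> str:
        """Compare the tests on an order form against the guideline's set."""
        if ordered == recommended:
            return "compliant"
        if ordered > recommended:
            return "noncompliant: tests added"  # the dominant pattern found
        return "noncompliant: other deviation"

    recommended = {"hemoglobin", "MCV", "ferritin"}  # assumed anemia work-up
    print(classify_order({"hemoglobin", "MCV", "ferritin"}, recommended))
    print(classify_order({"hemoglobin", "MCV", "ferritin", "TSH"}, recommended))
    # -> compliant
    # -> noncompliant: tests added
    ```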

    Predicting deterioration of patients with early sepsis at the emergency department using continuous heart rate variability analysis: a model-based approach

    Background: Sepsis is a life-threatening disease with an in-hospital mortality rate of approximately 20%. Physicians at the emergency department (ED) have to estimate the risk of deterioration in the coming hours or days and decide whether the patient should be admitted to the general ward or ICU, or can be discharged. Current risk stratification tools are based on measurements of vital parameters at a single time point. Here, we performed a time, frequency, and trend analysis on continuous electrocardiograms (ECG) at the ED to predict deterioration of septic patients. Methods: Patients were connected to a mobile bedside monitor that continuously recorded ECG waveforms from triage at the ED for up to 48 h. Patients were post hoc stratified into three groups depending on the development of organ dysfunction: no organ dysfunction, stable organ dysfunction, or progressive organ dysfunction (i.e., deterioration). Patients with de novo organ dysfunction and those who were admitted to the ICU or died were also assigned to the progressive organ dysfunction group. Heart rate variability (HRV) features over time were compared between the three groups. Results: In total, 171 unique ED visits with suspected sepsis were included between January 2017 and December 2018. HRV features were calculated over 5-min time windows and summarized into 3-h intervals for analysis. For each interval, the mean and slope of each feature were calculated. Of all analyzed features, the mean NN interval and the ultra-low-frequency, very-low-frequency, low-frequency, and total power differed between the groups at multiple points in time. Conclusions: We showed that continuous ECG recordings can be automatically analyzed and used to extract HRV features associated with clinical deterioration in sepsis. The predictive accuracy of our current model, based on HRV features derived from the ECG only, shows the potential of HRV measurements at the ED. Unlike other risk stratification tools employing multiple vital parameters, this approach does not require manual calculation of a score and can be used on continuous data over time. Trial registration: The protocol of this study was published by Quinten et al., 2017.
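    As described, features are computed per 5-min window and each 3-h interval is then summarized by the mean and the slope (trend) of each feature. A small illustrative sketch of that summarization step, with made-up numbers and an assumed data layout (not the authors' pipeline):

    ```python
    # Illustrative sketch, not the authors' pipeline: summarize per-window HRV
    # features into 3-h intervals as a mean plus a linear-trend slope.
    import numpy as np

    WINDOW_S = 5 * 60           # one feature value per 5-min window
    INTERVAL_S = 3 * 60 * 60    # summarized per 3-h interval (36 windows)

    def summarize(feature_per_window: np.ndarray) -> list:
        """Return (mean, slope) of one HRV feature for each 3-h interval."""
        per_interval = INTERVAL_S // WINDOW_S
        out = []
        for start in range(0, len(feature_per_window), per_interval):
            chunk = feature_per_window[start:start + per_interval]
            t = np.arange(len(chunk))
            slope = np.polyfit(t, chunk, 1)[0] if len(chunk) > 1 else 0.0
            out.append((float(chunk.mean()), float(slope)))
        return out

    # e.g., a made-up mean NN-interval (ms) series over 6 h of monitoring:
    mean_nn = np.random.default_rng(0).normal(820.0, 40.0, size=72)
    print(summarize(mean_nn))  # two (mean, slope) pairs, one per 3-h interval
    ```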

    Prognostic value of serial score measurements of the National Early Warning Score, the quick Sequential Organ Failure Assessment and the Systemic Inflammatory Response Syndrome to predict clinical outcome in early sepsis

    BACKGROUND AND IMPORTANCE: Sepsis is a common and potentially lethal syndrome, and early recognition is critical to prevent deterioration. Yet, currently available scores to facilitate recognition of sepsis lack prognostic accuracy. OBJECTIVE: To identify the optimal time point to determine NEWS, qSOFA and SIRS for the prediction of clinical deterioration in early sepsis, and to determine whether the change in these scores over time improves their prognostic accuracy. DESIGN: Post hoc analysis of prospectively collected data. SETTINGS AND PARTICIPANTS: This study was performed in the emergency department (ED) of a tertiary-care teaching hospital. Adult medical patients with (potential) sepsis were included. OUTCOME MEASURES AND ANALYSIS: The primary outcome was clinical deterioration within 72 h after admission, defined as development of organ failure, ICU admission or death. Secondary outcomes were the composite of ICU admission/death and a rise in SOFA score of at least 2. Scores were calculated at the ED at 30-min intervals. ROC analyses were performed to compare the prognostic accuracy of the scores. RESULTS: In total, 1750 patients were included, of whom 360 (20.6%) deteriorated and 79 (4.5%) went to the ICU or died within 72 h. The NEWS at triage (AUC, 0.62; 95% CI, 0.59-0.65) had a higher accuracy than qSOFA (AUC, 0.60; 95% CI, 0.56-0.63) and SIRS (AUC, 0.59; 95% CI, 0.56-0.63) for predicting deterioration. The AUC of the NEWS at 1 h (0.65; 95% CI, 0.63-0.69) and at 150 min after triage (0.64; 95% CI, 0.61-0.68) was higher than the AUC of the NEWS at triage. The qSOFA had the highest AUC at 90 min after triage (0.62; 95% CI, 0.58-0.65), whereas the SIRS had the highest AUC at 60 min after triage (0.60; 95% CI, 0.56-0.63); neither differed significantly from the corresponding score at triage. The NEWS had better accuracy for predicting ICU admission/death within 72 h than qSOFA (AUC difference, 0.092) and SIRS (AUC difference, 0.137). No differences were found between the scores for predicting a rise in SOFA score of at least 2 within 72 h. Patients with the largest improvement in any of the scores were more prone to deteriorate. CONCLUSION: NEWS had a higher prognostic accuracy to predict deterioration compared with SIRS and qSOFA; the highest accuracy was reached at 1 h after triage.
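    The comparison throughout is the ROC AUC of each score against the binary deterioration outcome, recomputed at each 30-min time point. A minimal sketch of one such comparison on synthetic data (the labels and score values below are fabricated for illustration, not study data):

    ```python
    # Synthetic illustration (fabricated labels and scores, not study data):
    # compare prognostic accuracy of two scores for a binary outcome via ROC AUC.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    deteriorated = rng.integers(0, 2, size=200)             # 0/1 outcome labels
    news = deteriorated * 1.5 + rng.normal(4.0, 2.0, 200)   # mock NEWS values
    qsofa = deteriorated * 0.4 + rng.normal(1.0, 1.0, 200)  # mock qSOFA values

    print(f"NEWS  AUC: {roc_auc_score(deteriorated, news):.2f}")
    print(f"qSOFA AUC: {roc_auc_score(deteriorated, qsofa):.2f}")
    # Repeating this at each 30-min time point (with 95% CIs, e.g. via
    # bootstrapping) mirrors the serial comparison described in the abstract.
    ```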

    Cohort profile of Acutelines: a large data/biobank of acute and emergency medicine

    Purpose: Research in acute care faces many challenges, including enrolment challenges, legal limitations in data sharing, limited funding and lack of singular ownership of the domain of acute care. To overcome these challenges, the Center of Acute Care of the University Medical Center Groningen in the Netherlands has established a de novo data, image and biobank named ‘Acutelines’. Participants: Clinical data, imaging data and biomaterials (i.e., blood, urine, faeces, hair) are collected from patients presenting to the emergency department (ED) with a broad range of acute disease presentations. A deferred consent procedure (by proxy) is in place to allow collecting data and biomaterials prior to obtaining written consent. The digital infrastructure used ensures automated capturing of all bedside monitoring data (i.e., vital parameters, electrophysiological waveforms) and secure importing of data from other sources, such as the electronic health records of the hospital, ambulance and general practitioner, municipal registration and pharmacy. Data are collected from all included participants during the first 72 hours of their hospitalisation, while follow-up data are collected at 3 months, 1 year, 2 years and 5 years after their ED visit. Findings to date: Enrolment of the first participant occurred on 1 September 2020. During the first month, 653 participants were screened for eligibility, of whom 180 were approached as potential participants. In total, 151 (84%) provided consent for participation, of whom 89 fulfilled criteria for collection of biomaterials. Future plans: The main aim of Acutelines is to facilitate research in acute medicine by providing the framework for novel studies and issuing data, images and biomaterials for future research. The protocol will be extended by connecting with central registries to obtain long-term follow-up data, for which we already request permission from the participant. Trial registration number: NCT04615065

    Effects of propranolol on fear of dental extraction: Study protocol for a randomized controlled trial

    Background: Undergoing an extraction has been shown to pose a significantly increased risk for the development of chronic apprehension for dental surgical procedures, disproportionate forms of dental anxiety (that is, dental phobia), and symptoms of post-traumatic stress. Evidence suggests that intrusive emotional memories of these events both induce and maintain these forms of anxiety. Addressing these problems effectively requires an intervention that durably reduces both the intrusiveness of key fear-related memories and state anxiety during surgery. Moreover, evidence suggests that propranolol is capable of inhibiting "memory reconsolidation" (that is, it blocks the process of storing a recently retrieved fear memory). Hence, the purpose of this trial is to determine the anxiolytic and fear memory reconsolidation inhibiting effects of the β-adrenoreceptor antagonist propranolol in patients with high levels of fear in anticipation of a dental extraction. Methods/Design: This trial is designed as a multicenter, randomized, placebo-controlled, two-group, parallel, double-blind trial of 34 participants. Consecutive patients who have been referred by their dentist to the departments of oral and maxillofacial surgery of a university hospital or a secondary referral hospital in the Netherlands for removal of at least two teeth and/or molars, and who self-report high to extreme fear in anticipation of a dental extraction, will be recruited. The intervention is the administration of two 40 mg propranolol capsules 1 hour prior to a dental extraction, followed by one 40 mg capsule directly postoperatively. Placebo capsules will be used as a comparator. The primary outcome will be dental trait anxiety score reduction from baseline to 4-week follow-up. The secondary outcomes will be self-reported anxiety during surgery, physiological parameters (heart rate and blood pressure) during recall of the crucial fear-related memory, and self-reported vividness and emotional charge of the crucial fear-related memory. Discussion: This randomized trial is the first to test the efficacy of 120 mg of perioperative propranolol versus placebo in reducing short-term ("state") anxiety during dental extraction, fear memory reconsolidation, and lasting dental ("trait") anxiety in a clinical population. If the results show a reduction in anxiety, this would offer support for routinely prescribing propranolol in patients who are fearful of undergoing dental extractions. Trial registration: ClinicalTrials.gov identifier: NCT02268357, registered on 7 October 2014. The Netherlands National Trial Register identifier: NTR5364, registered on 16 August 2015.

    IgE Cross-Reactivity of Cashew Nut Allergens

    Background: Allergic sensitisation towards cashew nut often occurs without a clear history of eating cashew nut. IgE cross-reactivity between cashew and pistachio nut is well described; however, the ability of cashew nut-specific IgE to cross-react with common tree nut species and other Anacardiaceae, like mango, pink peppercorn, or sumac, is largely unknown. Objectives: Cashew nut allergic individuals may cross-react to foods that are phylogenetically related to cashew. We aimed to determine IgE cross-sensitisation and cross-reactivity profiles in cashew nut-sensitised subjects towards botanically related proteins of other Anacardiaceae family members and related tree nut species. Method: Sera from children with a suspected cashew nut allergy (n = 56) were assessed for IgE sensitisation to common tree nuts, mango, pink peppercorn, and sumac using a dot blot technique. Allergen cross-reactivity patterns between Anacardiaceae species were subsequently examined by SDS-PAGE and immunoblot inhibition, and IgE-reactive allergens were identified by LC-MS/MS. Results: Of the 56 subjects analysed, 36 (63%) were positive on dot blot for cashew nut. Of these, 50% were mono-sensitised to cashew nut, 19% were co-sensitised to Anacardiaceae species, and 31% were co-sensitised to tree nuts. Subjects co-sensitised to Anacardiaceae species displayed a different allergen recognition pattern than subjects sensitised to common tree nuts. In pink peppercorn, putative albumin- and legumin-type seed storage proteins were found to cross-react with serum of cashew nut-sensitised subjects in vitro. In addition, a putative luminal binding protein was identified which, among others, may be involved in cross-reactivity between several Anacardiaceae species. Conclusions: These results demonstrate the in vitro presence of IgE cross-sensitisation in children towards multiple Anacardiaceae species. In this study, putative novel allergens were identified in cashew, pistachio, and pink peppercorn, which may underlie the observed cross-sensitisation to these species. The clinical relevance of this widespread cross-sensitisation is unknown.

    New paths for modelling freshwater nature futures

    Freshwater ecosystems are exceptionally rich in biodiversity and provide essential benefits to people. Yet they are disproportionately threatened compared to terrestrial and marine systems and remain underrepresented in the scenarios and models used for global environmental assessments. The Nature Futures Framework (NFF) has recently been proposed to advance the contribution of scenarios and models to environmental assessments. This framework places the diverse relationships between people and nature at its core, identifying three value perspectives as points of departure: Nature for Nature, Nature for Society, and Nature as Culture. We explore how the NFF may be implemented for improved assessment of freshwater ecosystems. First, we outline how the NFF and its main value perspectives can be translated to freshwater systems and explore what desirable freshwater futures would look like from each of these perspectives. Second, we review scenario strategies and current models to examine how freshwater modelling can be linked to the NFF in terms of its aims and outcomes. In doing so, we also identify which aspects of the NFF are not yet captured in current freshwater models and suggest possible ways to bridge these gaps. Our analysis provides future directions for more holistic freshwater model and scenario development and demonstrates how society can benefit from freshwater modelling efforts that are integrated with the value perspectives of the NFF.