
    Somnambulism: Emergency Department Admissions Due to Sleepwalking-Related Trauma.

    INTRODUCTION Somnambulism is a state of dissociated consciousness in which the affected person is partially asleep and partially awake. There is a pervasive public opinion that sleepwalkers are protected from hurting themselves. There have been few scientific reports of trauma associated with somnambulism and no published investigations of the epidemiology or trauma patterns associated with it. METHODS We included all emergency department (ED) admissions to University Hospital Inselspital, Berne, Switzerland, from January 1, 2000, until August 11, 2015, in which the patient had suffered a trauma associated with somnambulism. Demographic data (age, gender, nationality) and medical data (mechanism of injury, final diagnosis, hospital admission, mortality, and medication on admission) were included. RESULTS Of 620,000 screened ED admissions, 11 were associated with trauma and sleepwalking. Two patients (18.2%) had a history of known non-rapid eye movement parasomnias. The leading cause of admission was falls. Four patients (36.4%) required hospital admission for orthopedic injuries needing further diagnostic testing and treatment; these included two patients with multiple injuries (18.2%). None of the admitted patients died. CONCLUSION Although sleepwalking seems benign in the majority of cases and most of the few injured patients did not require hospitalization, major injuries are possible. When patients present with falls of unknown origin, the possibility that the fall was caused by somnambulism should be evaluated.

    Use of Calcium Channel Blockers is Associated with Mortality in Patients with Chronic Kidney Disease

    BACKGROUND/AIMS The use of antihypertensive medicines has been shown to reduce proteinuria, morbidity, and mortality in patients with chronic kidney disease (CKD). Despite pharmacodynamic differences between drug classes, no specific recommendation for a class of antihypertensive drugs is available for this population. We have therefore analysed the association between antihypertensive medicines and survival of patients with CKD. METHODS Out of 2687 consecutive patients undergoing kidney biopsy, a cohort of 606 subjects with retrievable medical therapy was included in the analysis. Kidney function was assessed by glomerular filtration rate (GFR) estimation at the time of kidney biopsy. The main outcome variable was death. RESULTS Overall, 114 patients (18.7%) died. In univariate regression analysis, the use of alpha-blockers and calcium channel antagonists, progression of disease, diabetes mellitus (DM) types 1 and 2, arterial hypertension, coronary heart disease, peripheral vascular disease, male sex, and age were associated with mortality (all p < 0.05). In a multivariate Cox regression model, the use of calcium channel blockers (HR 1.89), age (HR 1.04), DM type 1 (HR 8.43), DM type 2 (HR 2.17), and chronic obstructive pulmonary disease (HR 1.66) were associated with mortality (all p < 0.05). CONCLUSION In this cohort of CKD patients, predominantly with glomerulonephritis (GN), the use of calcium channel blockers, but not of other antihypertensive medicines, was associated with mortality.
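Under the proportional-hazards assumption behind Cox models, a hazard ratio (HR) multiplies the baseline hazard, so survival scales as S1(t) = S0(t)^HR. A minimal sketch of what the reported HR of 1.89 for calcium channel blockers would imply; the baseline survival value is illustrative, not taken from the study:

```python
def survival_under_hr(baseline_survival: float, hazard_ratio: float) -> float:
    """Proportional hazards: S1(t) = S0(t) ** HR.
    An HR > 1 lowers survival relative to baseline."""
    return baseline_survival ** hazard_ratio

# Illustrative only: if baseline survival at some time t were 0.85,
# an HR of 1.89 would imply survival of about 0.74 at the same t.
s = survival_under_hr(0.85, 1.89)
print(round(s, 3))
```

The same relation shows why an HR of 1.04 per year of age compounds: over 10 years the hazard scales by 1.04**10, roughly 1.48.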

    Interprofessional and interdisciplinary simulation-based training leads to safe sedation procedures in the emergency department

    BACKGROUND Sedation is a procedure required for many interventions in the emergency department (ED), such as reductions, surgical procedures, or cardioversions. However, especially under emergency conditions with high-risk patients and rapidly changing interdisciplinary and interprofessional teams, the procedure carries important risks. It is thus vital, but difficult, to implement a standard operating procedure for sedation in any ED. Reports on both implementation strategies and their success are currently lacking. This study describes the development, implementation, and clinical evaluation of an interprofessional and interdisciplinary simulation-based sedation training concept. METHODS All physicians and nurses with specialised training in emergency medicine at the Berne University Department of Emergency Medicine participated in a mandatory interdisciplinary and interprofessional simulation-based sedation training. The curriculum consisted of an individual self-learning module, an airway skill training course, three simulation-based team training cases, and a final practical learning course in the operating theatre. Before and after each training session, self-efficacy, awareness of emergency procedures, knowledge of sedation medication, and crisis resource management were assessed with a questionnaire. Changes in these measures were compared via paired tests, separately for groups formed based on experience and profession. To assess the clinical effect of training, we collected patient and team satisfaction as well as duration and complications for all sedations in the ED within the year after implementation. We further compared time to beginning of procedure, procedure duration, and time until discharge with the corresponding times in the one-year period before implementation. Cohen's d was calculated as the effect size for all statistically significant tests.
RESULTS Fifty staff members (26 nurses and 24 physicians) participated in the training. In all subgroups, there was a significant increase in self-efficacy and knowledge, with a high effect size (d_z = 1.8). Learning was independent of profession and experience level. In the clinical evaluation after implementation, we found no major complications among the sedations performed. Time to procedure improved significantly after the introduction of the training (d = 0.88). DISCUSSION Learning was independent of previous working experience and equally effective in raising self-efficacy and knowledge in all professional groups. The clinical outcome evaluation confirms the concept's safety and feasibility. CONCLUSION An interprofessional and interdisciplinary simulation-based sedation training is an efficient way to implement a conscious sedation concept in an ED.
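Cohen's d_z, the paired-samples effect size reported above, is the mean of the pre/post differences divided by the standard deviation of those differences. A short sketch with hypothetical questionnaire scores (the data are invented for illustration, not taken from the study):

```python
from statistics import mean, stdev

def cohens_dz(pre, post):
    """Cohen's d_z for paired samples: mean of the pairwise
    differences divided by the (sample) standard deviation of
    those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / stdev(diffs)

# Hypothetical pre/post knowledge scores for six participants:
pre = [4, 5, 3, 6, 4, 5]
post = [8, 7, 7, 9, 8, 7]
print(round(cohens_dz(pre, post), 2))
```

By the usual rule of thumb, d_z values around 0.8 or above count as large effects, so the reported d_z = 1.8 is a very large pre/post change.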

    Post hoc immunostaining of GABAergic neuronal subtypes following in vivo two-photon calcium imaging in mouse neocortex

    GABAergic neurons in the neocortex are diverse with regard to morphology, physiology, and axonal targeting pattern, indicating functional specializations within the cortical microcircuitry. Little information is available, however, about the functional properties of distinct subtypes of GABAergic neurons in the intact brain. Here, we combined in vivo two-photon calcium imaging in supragranular layers of the mouse neocortex with post hoc immunohistochemistry against the three calcium-binding proteins parvalbumin (PV), calretinin, and calbindin in order to assign subtype marker profiles to neuronal activity. Following coronal sectioning of fixed brains, we matched cells in corresponding volumes of image stacks acquired in vivo and in fixed brain slices. In GAD67-GFP mice, more than 95% of the GABAergic cells could be unambiguously matched, even in large volumes comprising more than a thousand interneurons. Triple immunostaining revealed a depth-dependent distribution of interneuron subtypes, with increasing abundance of PV-positive neurons with depth. Most importantly, the triple-labeling approach was compatible with previous in vivo calcium imaging following bulk loading of Oregon Green 488 BAPTA-1, which allowed us to classify spontaneous calcium transients recorded in vivo according to the neurochemically defined GABAergic subtypes. Moreover, we demonstrate that post hoc immunostaining can also be applied to wild-type mice expressing the genetically encoded calcium indicator Yellow Cameleon 3.60 in cortical neurons. Our approach is a general and flexible method to distinguish GABAergic subtypes in cell populations previously imaged in the living animal. It should thus facilitate dissecting the functional roles of these subtypes in neural circuitry.

    Retrospective evaluation of whole exome and genome mutation calls in 746 cancer samples

    Funder: NCI U24CA211006
    Abstract: The Cancer Genome Atlas (TCGA) and International Cancer Genome Consortium (ICGC) curated consensus somatic mutation calls using whole exome sequencing (WES) and whole genome sequencing (WGS), respectively. Here, as part of the ICGC/TCGA Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium, which aggregated whole genome sequencing data from 2,658 cancers across 38 tumour types, we compare WES and WGS side by side in 746 TCGA samples, finding that ~80% of mutations overlap in covered exonic regions. We estimate that low variant allele fraction (VAF < 15%) and clonal heterogeneity contribute up to 68% of private WGS mutations and 71% of private WES mutations. We observe that ~30% of private WGS mutations trace to mutations identified by a single variant caller in WES consensus efforts. WGS captures both ~50% more variation in exonic regions and unobserved mutations in loci with variable GC-content. Together, our analysis highlights technological divergences between two reproducible somatic variant detection efforts.
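Variant allele fraction is conventionally the fraction of sequencing reads at a locus that support the variant allele. A minimal sketch of the 15% threshold the study uses to flag hard-to-reproduce subclonal calls (the read counts are hypothetical):

```python
def variant_allele_fraction(alt_reads: int, ref_reads: int) -> float:
    """VAF = reads supporting the variant / total reads at the locus.
    Low-VAF calls (< 0.15 here) are the ones the comparison found
    hardest to reproduce between WES and WGS."""
    total = alt_reads + ref_reads
    if total == 0:
        raise ValueError("no coverage at locus")
    return alt_reads / total

# Hypothetical pileup: 9 variant-supporting reads out of 80 total.
vaf = variant_allele_fraction(9, 71)
print(vaf, vaf < 0.15)  # below the study's subclonal threshold
```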

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non-critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support-free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support-free days among critically ill patients were 10 (-1 to 16) in the ACE inhibitor group (n = 231), 8 (-1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support-free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
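The reported posterior probabilities can be roughly reproduced from the published odds ratios with a normal approximation on the log-odds scale, treating the 95% credible interval as mean ± 1.96 sd. This is a back-of-envelope check, not the trial's actual bayesian cumulative logistic model:

```python
from math import erf, log, sqrt

def prob_or_below_1(or_point: float, ci_low: float, ci_high: float) -> float:
    """Approximate P(OR < 1) assuming the log-OR posterior is normal
    with the 95% interval spanning mean +/- 1.96 sd. Since OR > 1
    means improvement here, this is the probability of harm."""
    mu = log(or_point)
    sd = (log(ci_high) - log(ci_low)) / (2 * 1.96)
    z = (0 - mu) / sd
    # standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# Reported ACE-inhibitor result: OR 0.77 (95% CrI, 0.58-1.06)
p = prob_or_below_1(0.77, 0.58, 1.06)
print(round(p, 3))
```

For the ACE-inhibitor arm this approximation gives roughly 0.95, in line with the reported posterior probability of harm of 94.9%; the small discrepancy is expected since the trial's posterior is not exactly normal.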

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high- and low–middle-income countries.

    Kidney biopsy in patients with glomerulonephritis: is the earlier the better?

    BACKGROUND Interventional diagnostic procedures are established for several diseases in medicine. Despite the KDOQI guideline recommendation for histological diagnosis of kidney disease to enable risk stratification, the optimal time point of biopsy has not been evaluated. We have therefore analyzed whether histological diagnosis of glomerulonephritis (GN) at an early stage of chronic kidney disease (CKD) is associated with a different outcome compared with diagnosis at a more advanced stage. METHODS A cohort of 424 consecutive patients with histological diagnosis of GN was included in a retrospective data analysis. Kidney function was assessed by glomerular filtration rate (GFR) estimation at the time point of kidney biopsy and after consecutive immunosuppressive therapy. Censored events were death, initiation of dialysis or kidney transplantation, or progression of disease, defined as deterioration of CKD stage ≥1 from kidney biopsy to the last available kidney function measurement. RESULTS Occurrence of death, dialysis/transplantation, or progression of disease was associated with GFR and CKD stage at the time of kidney biopsy (p < 0.001 for all). Patients with CKD stage 1 or 2 at kidney biopsy had fewer endpoints compared with patients with a GFR of <60 ml/min (p < 0.001). CONCLUSION Kidney function at the time point of histological GN diagnosis is associated with clinical outcome, likely due to early initiation of specific drug treatment. This suggests that selection of therapy yields the greatest benefit before renal function is impaired in GN.
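The CKD stages referred to above follow the standard KDOQI eGFR thresholds. A minimal sketch of the staging that separates the favourable-outcome group (stages 1–2) from patients with GFR < 60 ml/min (stage 3 or worse):

```python
def ckd_stage(egfr: float) -> int:
    """KDOQI CKD stage from eGFR (ml/min/1.73 m^2):
    stage 1 >= 90, stage 2 60-89, stage 3 30-59,
    stage 4 15-29, stage 5 < 15."""
    if egfr >= 90:
        return 1
    if egfr >= 60:
        return 2
    if egfr >= 30:
        return 3
    if egfr >= 15:
        return 4
    return 5

print(ckd_stage(75))  # stage 2: early-biopsy group with fewer endpoints
print(ckd_stage(45))  # stage 3: GFR < 60, more endpoints in the study
```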