53 research outputs found

    Retrospective evaluation of whole exome and genome mutation calls in 746 cancer samples

    Funder: NCI U24CA211006. Abstract: The Cancer Genome Atlas (TCGA) and International Cancer Genome Consortium (ICGC) curated consensus somatic mutation calls using whole exome sequencing (WES) and whole genome sequencing (WGS), respectively. Here, as part of the ICGC/TCGA Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium, which aggregated whole genome sequencing data from 2,658 cancers across 38 tumour types, we compare WES and WGS side by side in 746 TCGA samples, finding that ~80% of mutations overlap in covered exonic regions. We estimate that low variant allele fraction (VAF < 15%) and clonal heterogeneity contribute up to 68% of private WGS mutations and 71% of private WES mutations. We observe that ~30% of private WGS mutations trace to mutations identified by a single variant caller in WES consensus efforts. WGS captures both ~50% more variation in exonic regions and mutations, unobserved by WES, in loci with variable GC-content. Together, our analysis highlights technological divergences between two reproducible somatic variant detection efforts.
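
As a rough illustration of the kind of side-by-side comparison described above, the sketch below contrasts two somatic call sets restricted to jointly covered exonic positions and tallies which private calls fall under the 15% VAF cutoff. The data structures (dicts mapping variant keys to VAF) and helper names are assumptions made for illustration, not the PCAWG consensus pipeline.

```python
# Illustrative sketch (not the PCAWG pipeline): compare two somatic call sets
# restricted to jointly covered exonic positions and tally private calls with
# low variant allele fraction. The 15% cutoff follows the abstract.

def variant_key(chrom, pos, ref, alt):
    """Normalize a somatic variant into a hashable identity."""
    return (chrom, int(pos), ref.upper(), alt.upper())

def compare_call_sets(wes_calls, wgs_calls, covered_exons, vaf_cutoff=0.15):
    """
    wes_calls / wgs_calls: dicts mapping variant_key -> VAF.
    covered_exons: set of (chrom, pos) positions covered by both assays.
    Returns the overlap fraction and the share of private calls below the VAF cutoff.
    """
    wes = {k: v for k, v in wes_calls.items() if (k[0], k[1]) in covered_exons}
    wgs = {k: v for k, v in wgs_calls.items() if (k[0], k[1]) in covered_exons}

    shared = wes.keys() & wgs.keys()
    private_wes = wes.keys() - wgs.keys()
    private_wgs = wgs.keys() - wes.keys()

    union = wes.keys() | wgs.keys()
    overlap = len(shared) / len(union) if union else 0.0
    low_vaf_wes = sum(wes[k] < vaf_cutoff for k in private_wes) / max(len(private_wes), 1)
    low_vaf_wgs = sum(wgs[k] < vaf_cutoff for k in private_wgs) / max(len(private_wgs), 1)
    return overlap, low_vaf_wes, low_vaf_wgs
```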

    Pediatric Training Crisis of Emergency Medicine Residency during the COVID-19 Pandemic

    Coronavirus disease 2019 (COVID-19) is an emerging viral disease that has caused a global pandemic. Among emergency department (ED) patients, pediatric patient volume decreased the most and declined continuously during the pandemic period. A prolonged decrease in pediatric patient volume could result in inadequate pediatric training for Emergency Medicine (EM) residents. We collected data on pediatric patients who were first seen by EM resident physicians between 1 February 2019 and 31 January 2021, divided into pre-epidemic and epidemic periods at 1 February 2020. A significant reduction in pediatric patients per hour (PPH) seen by EM residents was noted in the epidemic period (from 1.55 to 0.81, p < 0.001). The average patient number was reduced significantly in the categories of infection (from 9.50 to 4.00, p < 0.001), respiratory system (from 84.00 to 22.00, p < 0.001), gastrointestinal system (from 52.00 to 34.00, p = 0.007), and otolaryngology (from 4.00 to 2.00, p = 0.022). Among infectious diseases, the most pronounced drop was in diagnoses of influenza and enterovirus infection. Reduced pediatric patient volume limited the clinical exposure of EM residents to pediatric EM training. Changes in the proportions of pediatric diseases presenting to the ED may result in inadequate experience with both common and specific pediatric diseases.

    Introduction of a mass burn casualty triage system in a hospital during a powder explosion disaster: a retrospective cohort study

    Abstract Background The triage system used during an actual mass burn casualty (MBC) incident is a major focus of concern. This study introduces an MBC triage system that was used by a burn center during an actual MBC incident following a powder explosion in New Taipei City, Taiwan. Methods This study retrospectively analyzed data from patients who were sent to the study hospital during an MBC incident. The patient list was retrieved from a national online management system. An MBC triage system was developed at the study hospital using the following modifiers: consciousness, breathing, and burn size. Medical records were retrieved from electronic records for analysis. Patient outcomes consisted of emergency department (ED) disposition and intervention. Results The patient population was predominantly female (56.3%), with an average age of 24.9 years. Mean burn sizes relative to total body surface area (TBSA) for triage level I, II, and III patients were 57.9%, 40.5%, and 8.7%, respectively. Intensive care unit (ICU) length of stay differed markedly by triage level (mean for levels I vs. II vs. III: 57.9 vs. 39.9 vs. 2.5 days; p < 0.001). Triage levels I and II indicated ICU admission with a sensitivity of 93.9% (95% CI 80.4–98.3%) and a specificity of 86.7% (95% CI 62.1–96.3%). Overall, 3 (6.3%) patients were under-triaged and 2 (4.2%) patients were over-triaged. Sixteen (48.5%) and 21 (63.6%) patients in triage levels I and II received endotracheal intubation and central venous catheterization, respectively. Sorting the study population with simple triage and rapid treatment (START) showed excellent sensitivity (100.0%) but poor specificity (53.3%). The Taiwan Triage and Acuity Scale (TTAS) showed 87.9% sensitivity and 93.9% specificity. Conclusions The current MBC triage algorithm served as a good indicator of ED disposition but might have drawn excessive immediate attention and had the potential to exhaust available resources. These findings add to our knowledge of MBC triage systems and should help future researchers adjust triage criteria to fit actual disasters.
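
The triage performance figures reported above follow from standard confusion-matrix formulas. The sketch below is a minimal illustration; the counts passed in are hypothetical placeholders chosen only to be consistent with the reported percentages, not the study's actual data.

```python
# Minimal sketch of how sensitivity and specificity are computed from a 2x2 table.
# The counts below are hypothetical placeholders, not the study's data.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Example with placeholder counts: triage level I/II as the "test",
# ICU admission as the reference outcome.
sens, spec = sensitivity_specificity(tp=31, fn=2, tn=13, fp=2)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```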

    One-year survival rate and healthcare costs after cardiac arrest in Taiwan, 2006-2012.

    The annual increase in costs and the quality of life of survivors of cardiac arrest are major concerns. This study used the National Health Insurance Research Database (NHIRD) of Taiwan to evaluate the 1-year survival rate and the annual healthcare costs of survivors after cardiac arrest. This retrospective, fixed-cohort study, conducted from 2006 to 2012, involved 2 million individuals randomly selected from the NHIRD of Taiwan. Adult patients at least 18 years old who were diagnosed with cardiac arrest were enrolled, and survival was followed up for 1 year. In total, 2,256 patients were enrolled. The survivor cohort accounted for 4% (89/2,256) of the study population. There were no significant differences in the demographic characteristics of the survival and non-survival cohorts, with the exceptions of gender (male: survival vs. non-survival, 50.6% vs. 64.5%, p = 0.007), diabetes mellitus (49.4% vs. 35.8%, p = 0.009), and acute coronary syndrome (44.9% vs. 31.9%, p = 0.010). Only 38 (1.7%) patients survived for > 1 year. The mean duration of hospital re-admission during the 1-year follow-up was 73.5 (SD: 110.2) days, and the mean healthcare cost during the 1-year follow-up was $12,953. Factors associated with total healthcare costs during the 1-year follow-up were city or county of residence (β: -23,604, p < 0.001), being widowed (β: 25,588, p = 0.049), and chronic obstructive pulmonary disease (COPD) (β: 14,438, p = 0.024). The annual healthcare costs of cardiac arrest survivors represented a substantial burden, and socioeconomic status and comorbidity were major confounders of costs. The outcome measures of cardiac arrest should extend beyond death and encompass destitution. These findings add to our knowledge of the health economics of cardiac arrest and point to future research on the healthcare of cardiac arrest survivors.

    In-Hospital and Long-Term Outcomes in Patients with Head and Neck Cancer Bleeding

    Background and Objectives: The purpose of the present study was to elucidate the in-hospital and long-term outcomes of patients with head and neck cancer (HNC) bleeding and to analyze the risk factors for mortality. Materials and Methods: We included patients who presented to the emergency department (ED) with HNC bleeding. Variables of patients who survived and died were compared, and associated factors were investigated using logistic regression and Cox's proportional hazards model. Results: A total of 125 patients were enrolled in the present study. Fifty-nine (52.8%) patients experienced a recurrent bleeding event. The in-hospital mortality rate was 16%. The overall survival at 1, 3, and 5 years was 48%, 41%, and 34%, respectively. The median survival time was 9.2 months. Multivariate logistic regression analyses revealed that risk factors for in-hospital mortality were inotropic support (OR = 10.41; CI 1.81–59.84; p = 0.009), hypopharyngeal cancer (OR = 4.32; CI 1.29–14.46; p = 0.018), and M stage (OR = 5.90; CI 1.07–32.70; p = 0.042). Multivariate Cox regression analyses indicated that heart rate > 110 beats/min (HR = 2.02; CI 1.16–3.51; p = 0.013), inotropic support (HR = 3.25; CI 1.20–8.82; p = 0.021), and hypopharyngeal cancer (HR = 2.22; CI 1.21–4.06; p = 0.010) were all significant independent predictors of poorer overall survival. Conclusions: HNC bleeding commonly represents advanced disease. Recognition of associated factors aids in the risk stratification of patients with HNC bleeding.

    Clinical Characteristics and In-Hospital Outcomes in Dialysis Patients with Septic Arthritis

    Background and Objectives: Septic arthritis is a medical emergency associated with high morbidity and mortality. The incidence of septic arthritis among dialysis patients is higher than in the general population, and dialysis patients with bacteremia frequently experience adverse outcomes. The aim of this study was to identify the clinical features and risk factors for longer hospital length of stay (LOS), positive blood culture, and in-hospital mortality in dialysis patients with septic arthritis. Materials and Methods: The medical records of 52 dialysis patients with septic arthritis admitted to our hospital from 1 January 2009 to 31 December 2020 were analyzed. The primary outcomes were bacteremia and in-hospital mortality. Variables were compared, and risk factors were evaluated using linear and logistic regression models. Results: Twelve (23.1%) patients had positive blood cultures. A tunneled cuffed catheter for dialysis access was used in eight (15.4%) patients, and its usage rate was significantly higher in patients with positive blood cultures than in those with negative blood cultures (41.7% vs. 7.5%, p = 0.011). Fever was present in 15 (28.8%) patients and was significantly more frequent in patients with positive blood cultures (58.3% vs. 20%, p = 0.025). The most frequently involved site was the hip (n = 21, 40.4%). The most common causative pathogens were Gram-positive cocci, with methicillin-resistant Staphylococcus aureus (MRSA) (n = 7, 58.3%) being dominant. The mean LOS was 29.9 ± 25.1 days. A tunneled cuffed catheter was a significant predictor of longer LOS (Coef = 0.49; CI 0.25–0.74). Predictors of positive blood culture were fever (p = 0.037) and tunneled cuffed catheter (OR = 7.60; CI 1.31–44.02; p = 0.024), and the predictor of in-hospital mortality was tunneled cuffed catheter (OR = 14.33; CI 1.12–183.18; p = 0.041). Conclusions: In the dialysis population, patients with a tunneled cuffed catheter for dialysis access had a significantly longer hospital LOS. Tunneled cuffed catheter and fever were independent predictors of positive blood culture, and tunneled cuffed catheter was a predictor of in-hospital mortality. Recognition of the associated factors allows for risk stratification and determination of the optimal treatment plan in dialysis patients with septic arthritis.

    Comparison Between Canadian Triage and Acuity Scale and Taiwan Triage System in Emergency Departments

    Since the implementation of National Health Insurance in Taiwan, emergency department (ED) volume has progressively increased, and the current triage system is insufficient and needs modification. This study compared the prioritization and resource-utilization differences between the four-level Taiwan Triage System (TTS) and the standardized five-level Canadian Triage and Acuity Scale (CTAS) among ED patients. Methods: This was a prospective observational study. All adult ED patients who presented to three different medical centers during the study period were included. Patients were triaged independently by the duty triage nurse using TTS and by a single trained research nurse using CTAS with computer support software. Hospitalization, length of stay (LOS), and medical resource consumption were analyzed by comparing TTS and CTAS acuity levels. Results: There was significant disparity in patient prioritization between TTS and CTAS among the 1,851 enrolled patients. With TTS, 7.8%, 46.1%, 45.9%, and 0.2% of patients were assigned to levels 1, 2, 3, and 4, respectively. With CTAS, 3.5%, 24.4%, 44.3%, 22.4%, and 5.5% were assigned to levels 1, 2, 3, 4, and 5, respectively. The hospitalization rate, LOS, and medical resource consumption differed significantly between the two triage systems and correlated better with CTAS. Conclusion: CTAS provided better discrimination for ED patient triage and showed greater validity in predicting hospitalization, LOS, and medical resource consumption. An accurate five-level triage scale appeared superior in predicting patient acuity and resource utilization.

    A Machine Learning Model for Predicting Unscheduled 72 h Return Visits to the Emergency Department by Patients with Abdominal Pain

    Seventy-two-hour unscheduled return visits (URVs) by emergency department (ED) patients are a key clinical index for evaluating the quality of care in EDs. This study aimed to develop a machine learning model to predict 72 h URVs for ED patients with abdominal pain. Electronic health record data were collected from the Chang Gung Research Database (CGRD) for 25,151 ED visits by patients with abdominal pain, and a total of 617 features were used for analysis. We used supervised machine learning models, namely logistic regression (LR), support vector machine (SVM), random forest (RF), extreme gradient boosting (XGB), and a voting classifier (VC), to predict URVs. The VC model achieved more favorable overall performance than the other models (AUROC: 0.74; 95% confidence interval (CI), 0.69–0.76; sensitivity, 0.39; specificity, 0.89; F1 score, 0.25). The reduced VC model achieved performance (AUROC: 0.72; 95% CI, 0.69–0.74) comparable to that of the full model using all clinical features. The VC model exhibited the most favorable performance in predicting 72 h URVs for patients with abdominal pain, for both the all-features and reduced-features models. Application of the VC model in the clinical setting, after validation, may help physicians make accurate decisions and decrease URVs.
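
A minimal sketch of a soft-voting ensemble along the lines of the VC described above is shown below. The choice of base learners, hyperparameters, and the train/test split are assumptions made for illustration rather than the study's actual configuration, and the xgboost package is assumed to be available.

```python
# Illustrative sketch of a soft-voting ensemble over LR, SVM, RF, and XGB;
# hyperparameters and split strategy are assumptions, not the study's setup.
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier  # assumes the xgboost package is installed

def build_voting_classifier():
    """Combine LR, SVM, RF, and XGB with soft (probability-averaged) voting."""
    return VotingClassifier(
        estimators=[
            ("lr", LogisticRegression(max_iter=1000)),
            ("svm", SVC(probability=True)),  # probability=True enables soft voting
            ("rf", RandomForestClassifier(n_estimators=300)),
            ("xgb", XGBClassifier(eval_metric="logloss")),
        ],
        voting="soft",
    )

def evaluate(X, y):
    """X: feature matrix (e.g., engineered clinical features); y: 72 h URV label."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42
    )
    vc = build_voting_classifier().fit(X_tr, y_tr)
    # AUROC on the held-out set, using the ensemble's predicted probabilities
    return roc_auc_score(y_te, vc.predict_proba(X_te)[:, 1])
```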

    A global survey of emergency department responses to the COVID-19 pandemic

    Publisher Copyright: © 2021 Mahajan et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. Introduction: Emergency departments (EDs) globally are addressing the coronavirus disease 2019 (COVID-19) pandemic with varying degrees of success. We leveraged the 17-country Emergency Medicine Education & Research by Global Experts (EMERGE) network and non-EMERGE ED contacts to understand ED emergency preparedness and practices globally when combating the COVID-19 pandemic. Methods: We electronically surveyed EMERGE and non-EMERGE EDs from April 3 to June 1, 2020 on ED capacity, pandemic preparedness plans, triage methods, staffing, supplies, and communication practices. The survey was available in English, Mandarin Chinese, and Spanish to optimize participation. We analyzed survey responses using descriptive statistics. Results: 74/129 (57%) EDs from 28 countries in all six World Health Organization global regions responded. Most EDs were in Asia (49%), followed by North America (28%) and Europe (14%). Nearly all EDs (97%) developed and implemented protocols for screening, testing, and treating patients with suspected COVID-19 infections. Sixty percent responded that provider staffing/back-up plans were ineffective. Many sites (47/74, 64%) reported staff missing work due to possible illness, with the highest proportion of COVID-19 exposures and infections among nurses. Conclusion: Despite having disaster plans in place, ED pandemic preparedness and response continue to be a challenge. Global emergency research networks are vital for generating and disseminating large-scale event data, which is particularly important during a pandemic.

    Evaluation of a Diagnostic and Management Algorithm for Adult Caustic Ingestion: New Concept of Severity Stratification and Patient Categorization

    Background: Caustic ingestion has gained increasing attention worldwide. However, whether to use esophagogastroduodenoscopy (EGD) or computed tomography (CT) for first-line investigation remains controversial. This study aimed to evaluate a diagnostic and management algorithm that combines EGD and CT for rapid triage. Methods: We established an algorithm for our hospital in 2013, aiming to maximize the benefits and minimize the limitations of EGD and CT. We then retrospectively analyzed the 163 enrolled patients treated between 2014 and 2019 and categorized them into 4 groups according to initial signs/symptoms and EGD/CT grading: A, n = 3 (1.8%): perforation signs directly confirmed by CT; B, n = 10 (6.1%): clinically suspected perforation not initially proven by CT; C, n = 91 (55.8%): initial perforation less favored but with EGD grade ≥ 2b or GI/systemic complications; and D, n = 59 (36.2%): clinically stable with EGD grade ≤ 2a. The morbidity and mortality of each group were analyzed. The predictive values of EGD and CT were examined by logistic regression analyses and receiver operating characteristic (ROC) curves. Results: The outcomes of this algorithm were reported. CT was imperative for patients with toxic signs and suspected perforation. For non-emergent operations, additional EGD was safe and helpful in identifying surgical necessity. For patients with an initially low perforation risk, EGD alone sufficiently determined the necessity of admission. Among inpatients, EGD provided excellent discrimination for predicting the risk of deterioration of signs/symptoms. Routine additional CT was beneficial only for those with deteriorating signs/symptoms. Conclusions: Initial signs/symptoms help in choosing EGD or CT as the first-line investigative tool in patients with caustic ingestion. CT is necessary for seriously injured patients, but it cannot replace EGD for moderate/mild injuries. The severity stratification and patient categorization help to simplify complex scenarios, accelerate decision-making, and prevent unnecessary intervention/therapy. External validation in a larger sample is further indicated for this algorithm.
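
A minimal sketch of the four-group categorization described above is given below; the input flags and the assumption that EGD grading follows the Zargar scale are illustrative simplifications, not the published algorithm's exact criteria.

```python
# Illustrative sketch of the A-D categorization described above; the exact
# clinical definitions of each input flag are simplified assumptions.

def categorize_caustic_patient(perforation_on_ct: bool,
                               suspected_perforation: bool,
                               egd_grade: str,
                               gi_or_systemic_complication: bool) -> str:
    """Map initial signs/symptoms and EGD/CT findings to groups A-D."""
    high_grade = egd_grade in {"2b", "3a", "3b"}  # grade >= 2b (Zargar scale assumed)
    if perforation_on_ct:
        return "A"  # perforation signs directly confirmed by CT
    if suspected_perforation:
        return "B"  # clinically suspected perforation, not initially proven by CT
    if high_grade or gi_or_systemic_complication:
        return "C"  # perforation less favored, but high-grade injury or complications
    return "D"      # clinically stable with EGD grade <= 2a
```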