
    Predicting Ebola infection: A malaria-sensitive triage score for Ebola virus disease.

    The non-specific symptoms of Ebola virus disease (EVD) pose a major problem for triage and isolation efforts at Ebola Treatment Centres (ETCs). Under the current triage protocol, half of the patients allocated to high-risk "probable" wards were EVD(-): a misclassification speculated to predispose patients to nosocomial EVD infection. A better understanding of the statistical relevance of individual triage symptoms is essential in resource-poor settings, where rapid, laboratory-confirmed diagnostics are often unavailable. This retrospective cohort study analyses the clinical characteristics of 566 patients admitted to the GOAL-Mathaska ETC in Sierra Leone. The diagnostic potential of each characteristic was assessed by multivariate analysis and incorporated into a statistically weighted predictive score, designed both to detect EVD and to discriminate it from malaria. Of the 566 patients, 28% were EVD(+) and 35% were malaria(+). Malaria was 2-fold more common in EVD(-) patients (p<0.05), and is thus an important differential diagnosis. Univariate analyses comparing EVD(+) vs. EVD(-) and EVD(+)/malaria(-) vs. EVD(-)/malaria(+) cohorts revealed the 7 characteristics with the highest odds for EVD infection, namely: reported sick-contact, conjunctivitis, diarrhoea, referral time of 4-9 days, pyrexia, dysphagia and haemorrhage. Conversely, myalgia was more predictive of EVD(-) or EVD(-)/malaria(+) status. Including these 8 characteristics in a triage score, we obtained an 89% ability to discriminate EVD(+) from either EVD(-) or EVD(-)/malaria(+). This study proposes a highly predictive and easy-to-use triage tool, which stratifies the risk of EVD infection with 89% discriminative power for both EVD(-) and EVD(-)/malaria(+) differential diagnoses. Improved triage could preserve resources by identifying those in need of more specific differential diagnostics, and could bolster infection prevention and control measures by better compartmentalizing the risk of nosocomial infection.
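
    As a rough illustration of how such a statistically weighted score can be built, the sketch below fits a logistic regression on binary triage characteristics and uses the fitted coefficients as symptom weights. This is a minimal sketch on synthetic data, not the authors' published method; the feature names merely echo the characteristics listed above.

```python
# Minimal sketch of a statistically weighted triage score (not the authors'
# exact method): fit a logistic regression on binary triage characteristics,
# use the coefficients as weights, and measure discrimination with ROC AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 566
symptoms = ["sick_contact", "conjunctivitis", "diarrhoea", "referral_4_9d",
            "pyrexia", "dysphagia", "haemorrhage", "myalgia"]
# Hypothetical binary triage characteristics (1 = present).
X = rng.integers(0, 2, size=(n, len(symptoms)))
# Synthetic EVD status, loosely tied to the first symptom for illustration only.
y = (X[:, 0] + rng.normal(0, 1, n) > 0.8).astype(int)

model = LogisticRegression().fit(X, y)
weights = dict(zip(symptoms, model.coef_[0].round(2)))  # per-symptom weights
score = X @ model.coef_[0]                              # weighted score per patient
print("symptom weights:", weights)
print("discriminative ability (ROC AUC):", round(roc_auc_score(y, score), 2))
```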

    Predicting Ebola Severity: A Clinical Prioritization Score for Ebola Virus Disease.

    Despite the notoriety of Ebola virus disease (EVD) as one of the world's most deadly infections, EVD has a wide range of outcomes, where asymptomatic infection may be almost as common as fatality. With increasingly sensitive EVD diagnosis, there is a need for more accurate prognostic tools that objectively stratify clinical severity, to better allocate limited resources and identify those most in need of intensive treatment. This retrospective cohort study analyses the clinical characteristics of 158 EVD(+) patients at the GOAL-Mathaska Ebola Treatment Centre, Sierra Leone. The prognostic potential of each characteristic was assessed and incorporated into a statistically weighted disease score. The mortality rate among EVD(+) patients was 60.8% and was highest in those aged <5 or >25 years (p<0.05). Death was significantly associated with malaria co-infection (OR = 2.5, p = 0.01). However, this association was abrogated after adjustment for Ebola viral load (p = 0.1), potentially indicating a pathologic synergy between the infections. Similarly, referral time interacted with viral load, and adjustment revealed referral time as a significant determinant of mortality, quantifying the benefit of early reporting as a 12% reduction in mortality risk per day (p = 0.012). Disorientation was the strongest unadjusted predictor of death (OR = 13.1, p = 0.014), followed by hiccups, diarrhoea, conjunctivitis, dyspnoea and myalgia. Including these characteristics in multivariate prognostic scores, we obtained a 91% and 97% ability to discriminate death at or after triage, respectively (area under the ROC curve). This study proposes highly predictive and easy-to-use prognostic tools that stratify the risk of EVD mortality at or after EVD triage.
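
    The adjustment step described above (an association that is abrogated once a covariate is added) can be illustrated with a simple before/after comparison of logistic models. The sketch below uses synthetic data and hypothetical variable names; it is not the study's analysis code.

```python
# Hedged sketch of covariate adjustment: estimate the odds ratio for malaria
# co-infection on mortality before and after adding viral load as a covariate.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 158
# Synthetic data: viral load drives death and is also correlated with malaria
# co-infection, so malaria looks predictive until viral load is adjusted for.
viral_load = rng.normal(0, 1, n)                 # e.g. standardized log10 copies/mL
malaria = (viral_load + rng.normal(0, 1, n) > 0).astype(int)
death = (1.5 * viral_load + rng.normal(0, 1, n) > 0).astype(int)

m_unadj = sm.Logit(death, sm.add_constant(malaria)).fit(disp=0)
X_adj = sm.add_constant(np.column_stack([malaria, viral_load]))
m_adj = sm.Logit(death, X_adj).fit(disp=0)

print("unadjusted malaria OR:", round(float(np.exp(m_unadj.params[1])), 2))
print("adjusted malaria OR:  ", round(float(np.exp(m_adj.params[1])), 2))
```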

    Maintenance Therapy With Tumor-Treating Fields Plus Temozolomide vs Temozolomide Alone for Glioblastoma: A Randomized Clinical Trial.

    IMPORTANCE: Glioblastoma is the most devastating primary malignancy of the central nervous system in adults. Most patients die within 1 to 2 years of diagnosis. Tumor-treating fields (TTFields) are a locoregionally delivered antimitotic treatment that interferes with cell division and organelle assembly. OBJECTIVE: To evaluate the efficacy and safety of TTFields used in combination with temozolomide maintenance treatment after chemoradiation therapy for patients with glioblastoma. DESIGN, SETTING, AND PARTICIPANTS: After completion of chemoradiotherapy, patients with glioblastoma were randomized (2:1) to receive maintenance treatment with either TTFields plus temozolomide (n = 466) or temozolomide alone (n = 229) (median time from diagnosis to randomization, 3.8 months in both groups). The study enrolled 695 of the planned 700 patients between July 2009 and November 2014 at 83 centers in the United States, Canada, Europe, Israel, and South Korea. The trial was terminated based on the results of this planned interim analysis. INTERVENTIONS: Treatment with TTFields was delivered continuously (>18 hours/day) via 4 transducer arrays placed on the shaved scalp and connected to a portable medical device. Temozolomide (150-200 mg/m²/d) was given for 5 days of each 28-day cycle. MAIN OUTCOMES AND MEASURES: The primary end point was progression-free survival in the intent-to-treat population (significance threshold of .01), with overall survival in the per-protocol population (n = 280) as a powered secondary end point (significance threshold of .006). This prespecified interim analysis was to be conducted on the first 315 patients after at least 18 months of follow-up. RESULTS: The interim analysis included 210 patients randomized to TTFields plus temozolomide and 105 randomized to temozolomide alone, and was conducted at a median follow-up of 38 months (range, 18-60 months). Median progression-free survival in the intent-to-treat population was 7.1 months (95% CI, 5.9-8.2 months) in the TTFields plus temozolomide group and 4.0 months (95% CI, 3.3-5.2 months) in the temozolomide alone group (hazard ratio [HR], 0.62 [98.7% CI, 0.43-0.89]; P = .001). Median overall survival in the per-protocol population was 20.5 months (95% CI, 16.7-25.0 months) in the TTFields plus temozolomide group (n = 196) and 15.6 months (95% CI, 13.3-19.1 months) in the temozolomide alone group (n = 84) (HR, 0.64 [99.4% CI, 0.42-0.98]; P = .004). CONCLUSIONS AND RELEVANCE: In this interim analysis of 315 patients with glioblastoma who had completed standard chemoradiation therapy, adding TTFields to maintenance temozolomide chemotherapy significantly prolonged progression-free and overall survival. TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00916409.
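
    For readers wanting to reproduce this style of analysis, the sketch below shows a generic Kaplan-Meier estimate and Cox proportional-hazards model for a two-arm trial, using the lifelines library on simulated data. Column names and numbers are hypothetical; this is not the trial's statistical code.

```python
# Illustrative two-arm survival analysis: Kaplan-Meier median survival in one
# arm plus a Cox model giving the hazard ratio between arms (synthetic data).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(2)
n = 315
arm = rng.integers(0, 2, n)                       # 1 = TTFields + temozolomide
# Exponential survival times with a longer scale in the combination arm.
time = rng.exponential(scale=np.where(arm == 1, 20.5, 15.6))
event = (rng.random(n) < 0.8).astype(int)         # roughly 20% censored

df = pd.DataFrame({"time": time, "event": event, "arm": arm})

kmf = KaplanMeierFitter().fit(df.loc[df.arm == 1, "time"],
                              df.loc[df.arm == 1, "event"])
print("median survival, combination arm:", round(kmf.median_survival_time_, 1))

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print("hazard ratio for arm:", round(float(np.exp(cph.params_["arm"])), 2))
```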

    Setting a baseline for global urban virome surveillance in sewage

    The rapid development of megacities and their growing connectedness across the world are becoming a distinct driver of emerging disease outbreaks. Early detection of unusual disease emergence and spread should therefore include such cities as part of risk-based surveillance. A catch-all metagenomic sequencing approach applied to urban sewage could provide an unbiased insight into the dynamics of viral pathogens circulating in a community irrespective of access to care, a potential already demonstrated for the surveillance of poliovirus. Here, we present a detailed characterization of sewage viromes from a snapshot of 81 high-density urban areas across the globe, including an in-depth assessment of potential biases, as a proof of concept for catch-all viral pathogen surveillance. We show the ability to detect a wide range of viruses, as well as geographical and seasonal differences for specific viral groups. Our findings offer a cross-sectional baseline for further research on viral surveillance from urban sewage samples and place previous studies in a global perspective.
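
    A minimal sketch of the kind of aggregation such a survey implies: per-sample viral read counts are normalized to relative abundance (correcting for unequal sequencing depth) before a viral group is compared across regions. The table layout, city names, and counts below are hypothetical.

```python
# Normalize viral read counts to within-sample relative abundance, then
# compare viral groups across regions (hypothetical toy data).
import pandas as pd

reads = pd.DataFrame({
    "city":   ["Copenhagen", "Copenhagen", "Nairobi", "Nairobi"],
    "region": ["Europe", "Europe", "Africa", "Africa"],
    "virus":  ["Picornaviridae", "Adenoviridae", "Picornaviridae", "Adenoviridae"],
    "reads":  [12000, 3000, 8000, 16000],
})
# Relative abundance within each city corrects for unequal sequencing depth.
reads["rel_abundance"] = reads["reads"] / reads.groupby("city")["reads"].transform("sum")
print(reads.groupby(["region", "virus"])["rel_abundance"].mean())
```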

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: A systematic analysis for the Global Burden of Disease Study 2015

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods: We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings: Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. 
In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation: Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding: Bill & Melinda Gates Foundation.
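
    For context, the counterfactual attribution described in the Methods is conventionally expressed through the population attributable fraction (PAF), which compares the observed exposure distribution with the theoretical-minimum-risk distribution. A textbook statement of the comparative-risk-assessment form (not quoted from the study) is:

```latex
\mathrm{PAF} = \frac{\int RR(x)\,P(x)\,dx \;-\; \int RR(x)\,P'(x)\,dx}{\int RR(x)\,P(x)\,dx}
```

    where P(x) is the observed exposure distribution, P'(x) the counterfactual (theoretical minimum risk) distribution, and RR(x) the relative risk at exposure level x; attributable deaths or DALYs are then the PAF multiplied by the total burden of the linked outcome.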

    Measurement of the cross section of high transverse momentum Z→bb̄ production in proton–proton collisions at √s = 8 TeV with the ATLAS detector

    This Letter reports the observation of a high transverse momentum Z→bb̄ signal in proton–proton collisions at √s = 8 TeV and the measurement of its production cross section. The data analysed were collected in 2012 with the ATLAS detector at the LHC and correspond to an integrated luminosity of 19.5 fb⁻¹. The Z→bb̄ decay is reconstructed from a pair of b-tagged jets, clustered with the anti-kt jet algorithm with R = 0.4, that have low angular separation and form a dijet with pT > 200 GeV. The signal yield is extracted from a fit to the dijet invariant mass distribution, with the dominant multi-jet background mass shape estimated by employing a fully data-driven technique that reduces the dependence of the analysis on simulation. The fiducial cross section is determined to be σ_fid(Z→bb̄) = 2.02 ± 0.20 (stat.) ± 0.25 (syst.) ± 0.06 (lumi.) pb = 2.02 ± 0.33 pb, in good agreement with next-to-leading-order theoretical predictions.
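
    To illustrate the general idea of extracting a signal yield from a fit to an invariant mass distribution, the toy sketch below fits a Gaussian peak on a smooth falling background by least squares. The functional forms, binning, and numbers are assumptions for illustration only; the ATLAS analysis itself uses a fully data-driven background estimate.

```python
# Toy signal-plus-background fit to a dijet invariant-mass spectrum.
import numpy as np
from scipy.optimize import curve_fit

def model(m, a_sig, mu, sigma, a_bkg, slope):
    """Gaussian signal peak plus an exponentially falling background."""
    return a_sig * np.exp(-0.5 * ((m - mu) / sigma) ** 2) + a_bkg * np.exp(-slope * m)

rng = np.random.default_rng(3)
m = np.linspace(60, 160, 50)                        # dijet mass bin centres [GeV]
truth = model(m, 300.0, 91.0, 10.0, 5000.0, 0.02)   # Z peak on a falling spectrum
counts = rng.poisson(truth).astype(float)           # pseudo-data with Poisson noise

popt, pcov = curve_fit(model, m, counts,
                       p0=[200.0, 90.0, 12.0, 4000.0, 0.01],
                       sigma=np.sqrt(np.maximum(counts, 1.0)))
print("fitted peak position:", round(popt[1], 1), "GeV")
print("fitted peak amplitude:", round(popt[0], 1),
      "+/-", round(float(np.sqrt(pcov[0, 0])), 1))
```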

    Operation and performance of the ATLAS semiconductor tracker

    The semiconductor tracker is a silicon microstrip detector forming part of the inner tracking system of the ATLAS experiment at the LHC. The operation and performance of the semiconductor tracker during the first years of LHC running are described. More than 99% of the detector modules were operational during this period, with an average intrinsic hit efficiency of (99.74 ± 0.04)%. The evolution of the noise occupancy is discussed, and measurements of the Lorentz angle, δ-ray production and energy loss are presented. The alignment of the detector is found to be stable at the few-micron level over long periods of time. Radiation damage measurements, which include the evolution of detector leakage currents, are found to be consistent with predictions and are used in the verification of radiation background simulations.
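
    As a back-of-envelope illustration of the quoted precision, a hit efficiency and its binomial uncertainty can be computed directly from hit counts; the counts below are hypothetical, chosen only to reproduce the order of the quoted uncertainty.

```python
# Hit efficiency with a simple binomial uncertainty (hypothetical counts).
import math

expected_hits = 16_000           # hypothetical number of expected hits
observed_hits = 15_958           # hypothetical number of observed hits
eff = observed_hits / expected_hits
err = math.sqrt(eff * (1 - eff) / expected_hits)
print(f"efficiency = ({100 * eff:.2f} +/- {100 * err:.2f})%")
```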

    Measurement of the correlation between flow harmonics of different order in lead-lead collisions at √sNN = 2.76 TeV with the ATLAS detector

    Correlations between the elliptic or triangular flow coefficients vm (m = 2 or 3) and other flow harmonics vn (n = 2 to 5) are measured using √sNN = 2.76 TeV Pb+Pb collision data collected in 2010 by the ATLAS experiment at the LHC, corresponding to an integrated luminosity of 7 μb⁻¹. The vm–vn correlations are measured at midrapidity as a function of centrality and, for events within the same centrality interval, as a function of the event ellipticity or triangularity defined in a forward rapidity region. For events within the same centrality interval, v3 is found to be anticorrelated with v2, and this anticorrelation is consistent with similar anticorrelations between the corresponding eccentricities, ε2 and ε3. However, v4 is observed to increase strongly with v2, and v5 to increase strongly with both v2 and v3. The trend and strength of the vm–vn correlations for n = 4 and 5 are found to disagree with the εm–εn correlations predicted by initial-geometry models. Instead, these correlations are found to be consistent with the combined effects of a linear contribution to vn and a nonlinear term that is a function of v2² or of v2·v3, as predicted by hydrodynamic models. A simple two-component fit is used to separate these two contributions. The extracted linear and nonlinear contributions to v4 and v5 are found to be consistent with previously measured event-plane correlations.
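
    A hedged sketch of the kind of two-component fit mentioned above, assuming the quadrature form v4 = sqrt(a² + (b·v2²)²) sometimes used in the flow literature to combine a linear component with a nonlinear response to v2. The functional form and the synthetic points are assumptions, not the paper's published fit.

```python
# Two-component (linear + nonlinear) fit of v4 against v2 on synthetic points.
import numpy as np
from scipy.optimize import curve_fit

def v4_model(v2, a, b):
    # a: linear (v2-independent) component; b * v2**2: nonlinear response to v2.
    return np.sqrt(a**2 + (b * v2**2) ** 2)

rng = np.random.default_rng(4)
v2 = np.linspace(0.02, 0.12, 15)
v4 = v4_model(v2, 0.01, 1.2) + rng.normal(0, 5e-4, v2.size)  # synthetic data

popt, pcov = curve_fit(v4_model, v2, v4, p0=[0.01, 1.0])
print("linear component a =", round(popt[0], 4))
print("nonlinear coefficient b =", round(popt[1], 3))
```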